
How to Migrate from Lotus Notes to Office 365

Due to the immense popularity of the cloud-based storage that Office 365 offers, more and more users are migrating to this platform from IBM Lotus Notes. The complex management of Lotus Domino is another reason behind Lotus Notes to Office 365 migration. Some of the reasons for switching email platforms are:

  •  Ever-rising maintenance cost of the Domino Server
  •  Technical expertise required to operate Lotus Notes
  •  Comprehensive online access to data in Exchange and shorter data transfer times
  •  Cost-effectiveness of the Exchange environment
  •  Optimized features of Exchange and its client applications

Apart from the above-mentioned reasons, there are other factors as well that compel organizations to move to Office 365 from the Lotus Domino platform. Migration can be performed by following the procedure below.

Procedure to Migrate NSF Mailboxes to Office 365 Environment

Step 1: Create Backup

Before initiating the migration procedure, make sure to back up the NSF files using an IMAP server. This is important, as manual procedures are quite risky and can lead to data loss.

Step 2: Create New Mailboxes

To migrate Lotus Notes to Office 365, the next step involves creating new Office 365 mailboxes for all Lotus Notes user profiles.

Step 3: Enable IMAP

  1. Select ‘Domino Administration’ and then click on ‘Configuration’. Next, open the Server document for the server that runs the IMAP service
  2. Then, enable the default TCP/IP port
  3. Change the port values
  4. Next, set the given fields to ‘Enabled’ in the Mail IMAP column
  5. Click on ‘Save’, then ‘Close’, and then ‘Exit’

Step 4: Connect to IMAP Connector

To synchronize the email messages from the NSF files to the Office 365 platform, you can use IMAP connectors.
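
Before kicking off the synchronization, it can help to confirm that the Domino mailbox is actually reachable over IMAP. Below is a minimal Python sketch for such a check, assuming IMAP over SSL on port 993; the host name and credentials are placeholders you would replace with your own:

    import imaplib

    # Placeholder values - replace with your Domino server and a test mailbox
    DOMINO_HOST = "domino.example.com"
    USER = "notesuser"
    PASSWORD = "secret"

    # Connect over IMAP/SSL and count the messages in the inbox
    conn = imaplib.IMAP4_SSL(DOMINO_HOST, 993)
    conn.login(USER, PASSWORD)
    status, data = conn.select("INBOX", readonly=True)
    print("INBOX message count:", data[0].decode())
    conn.logout()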

The above-mentioned procedure accomplishes Lotus Notes to Office 365 migration without any third-party software. However, the procedure is not very reliable, as the mailbox data can be put at risk.

Limitations Associated with Manual NSF to Office 365 Migration

There are several limitations associated with the manual procedure for migrating mailboxes from the Lotus Notes NSF platform to Office 365.

  • Make sure that the NSF client can connect to the internet if it is behind proxy settings or firewalls
  • IMAP connectors do not allow transferring calendar entries; therefore, they have to be transferred manually
  • Transferring calendar entries manually may consume a lot of time
  • Using IMAP connectors, a maximum of 1 GB of data can be migrated to Office 365 (a rough size check is sketched after this list)
  • With IMAP connectors, offline Lotus Notes data cannot be migrated, as only online data can be moved to Office 365
  • If you need to migrate Lotus Notes to Office 365 manually, you need sound technical expertise to execute the entire procedure
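
Because of the 1 GB ceiling mentioned above, it is worth getting a rough idea of how much data a mailbox holds before you start. Here is a small Python sketch (again with a placeholder host and credentials, and assuming the server reports RFC822.SIZE) that asks the IMAP server for the size of each message and adds them up:

    import imaplib

    # Placeholder connection details - adjust for your environment
    conn = imaplib.IMAP4_SSL("domino.example.com", 993)
    conn.login("notesuser", "secret")
    conn.select("INBOX", readonly=True)

    # Ask the server for the size of every message and add them up
    status, data = conn.fetch("1:*", "(RFC822.SIZE)")
    total_bytes = 0
    for item in data:
        # Each entry looks like b'1 (RFC822.SIZE 4579)'
        total_bytes += int(item.decode().rstrip(")").split()[-1])

    print("Approximate INBOX size: %.1f MB" % (total_bytes / (1024.0 * 1024)))
    conn.logout()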

Automatic migration, however, mitigates these risks and reduces the negative consequences of the Lotus Notes to Exchange transition. Moreover, automated migration procedures not only ensure data integrity and security, but also make it possible to handle the migration via a single management console.

Stellar Phoenix NSF to PST Technician version helps migrate Lotus Notes files to an Office 365 supported file format. Apart from this, it also allows accessing NSF files in MS Outlook, the popular email client. Moreover, the files can be saved in multiple file formats, including MSG, EML, RTF, HTML, and PDF.

While MSG and EML allow accessing email messages as individual files, PDF and RTF allow accessing the files independently without installing any dedicated platform. These file formats offer extended accessibility to NSF mailboxes, as they can be opened in multiple email clients (both web-based and desktop-based).


Building All-In-One uninstaller packages.

I recently embarked on looking for a way to uninstall all versions of QuickTime, given the latest security issues.  I searched IT Ninja, Dell Software Forums, and Google trying to find a simplified way to build an all-in-one uninstaller in K1000.  Going version to version just took way too long.  I stumbled across a field in the software inventory that caught my attention.  Here is what I did to make this work:

Go into Inventory > Software.
Click Choose Action > New.

Name the software something like "QuickTime AIO"
Choose the Operating Systems you want to target (Control + Click for multiple).

The magic happens in the next area: Custom Inventory Rule
Note: Click on the blue ? icon to see all of the strings available.
For this example I chose to use the QuickTime executable file.  So I entered "FileExists(c:\program files (x86)\quicktime\quicktimeplayer.exe)" into the Custom Inventory Rule box.


Check the box "Don't Replicate Associated File".

Save the software item and force all of your devices to inventory.  

Now you can select this new software item in your Distribution > Managed Installations area.  

I used the setup below to build the uninstaller.


The Full Command Line is:  wmic product where "name like 'quicktime%'" call uninstall /nointeractive
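
If you want to see what the wildcard will match before pushing the uninstall, you can first run something like wmic product where "name like 'quicktime%'" get name,version on a test machine; it simply lists the matching products without removing anything.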

Let me know if you have any questions regarding this.  Also, if you have any feedback or suggestions, please share them.



Faster, Smarter DevOps

Call it DevOps or not, if you are concerned about releasing more code faster and at a higher quality, the resulting software delivery chain and process will look and smell like DevOps. But for existing development teams, no matter what the velocity objective is, getting from here to there is not something that can be done without a plan.

Moving your release cadence from months to weeks is not just about learning Agile practices and getting some automation tools. It involves people, tooling and a transition plan. I will discuss some of the benefits and approaches to getting there.

Waterfall to Agile, Agile to Continuous Integration, Continuous Integration to Continuous Deployment. Whatever your processes are, the theme is the same: find a way to get code to users faster without sacrificing quality. But speed and quality are sometimes in opposition to each other. Going faster means things can break faster, and when we only think about DevOps as releases, it’s easy to fall into this trap.


Established development shops cannot just jump from one flow to another. Unless you start out net new, the goal is to introduce new processes without delaying releases for three months or more to do the transition in one lump. This is often done using a pincer approach that addresses both bottom-up tactics and top-down oversight and culture at the same time.

However, because adopting DevOps tools is so easy, the trend is to focus on tactics only and adopt from the bottom up without consideration of the entire pipeline. The outcome is release automation tools that dictate your delivery chain for you, and not the other way around. Here are the key categories that get neglected when teams hit the accelerator without a plan in place.

Structured Automation: DevOps requires automation. But what is often not considered is automation that sustains and fits into the entire delivery chain. Considerations such as governance, artifact organization and inventory, metrics and security need to be made. If an organization establishes a vetting process for all new automation and how it fits into the pipeline’s orchestration, then new automation will support what exists today and in the future.

For example, many organizations driving down the DevOps path have encountered challenges when trying to incorporate practices from security or governance teams. Historically these teams have resided outside of the dev and ops echo chamber and their processes were asynchronously aligned to the work being done. The challenge for many organizations is to determine the best ways to bring the people, processes, and technology supporting these initiatives into the fold without slowing things down. The best organizations are finding new ways to automate policies from security and governance teams by shifting away from artisanal, asynchronous approaches to synchronous processes earlier in the lifecycle.

Let’s take a look at an example of application security. A number of technology vendors in the application security arena are touting “automation” as a key value point for their solutions in order to better fit them into a DevOps tool chain. In some instances, automation means that machines are now fully responsible for monitoring, analyzing, and fixing security vulnerabilities for those applications at wire-speed. In other instances, automation implies facilitation of human-centric workflows that might represent hours or days of asynchronous analysis not fit for continuous operations. In both cases, the technologies may accomplish similar ends, but their approaches could be dramatically different.

Also, one solution might be built to support asynchronous investigations by a security professional, while the other might provide synchronous support to a developer at the design and build stages of the SDLC. Establishing a vetting process can help determine if the automation levels required by a team or process can truly be delivered before investments are made. It is also worth noting that layers of obscurity frequently exist within words like “automation”, “integration”, “configuration”, and “continuous”.


For the complete story, please continue to InfoQ http://www.infoq.com/articles/faster-smarter-devops.


Using KACE K1000 Device Actions With Any Windows Browser

The Dell KACE K1000 Systems Management appliance offers the option to define custom device actions that will start a remote session or execute custom commands like ping etc. Most of these actions only work with Internet Explorer since they use ActiveX.


This article describes a custom solution that can launch the device actions from the most common browsers available for Microsoft Windows.

The download is available here: Download. It is a KACE K1000 script (6.4 SP2) for Windows that can be imported into a K1000 appliance. Download the script, copy it to the clientdrop share of the K1000, and import it into the K1000. From there the script can be deployed to all the Windows computers from which the device action(s) shall be executed.

If dedicated remote control software like UltraVNC Viewer is required on those PCs, that software must be deployed separately. Built-in commands like mstsc, ping, etc. do not require a separate installation.


This is an example of a custom action for the UltraVNC Viewer in the K1000:

K1000DeviceAction://C:/Program Files (x86)/uvnc bvba/UltraVNC/vncviewer@exe@KACE_HOST_IP


K1000DeviceAction:// is a pseudo-protocol created by the script on the computers the script gets deployed to. It makes the browser accept custom commands because it simulates a custom protocol.

C:/Program Files (x86)/uvnc bvba/UltraVNC/vncviewer is the path to the UltraVNC Viewer, including the filename of the file to execute without the .exe extension. Make sure to use / instead of \ in the path, otherwise the action will not work.

exe is the extension of the file to execute

KACE_HOST_IP is a K1000 device action variable for the remote computer's IP address

@ is a separation character allowing the script to generate the command to execute.


The common pattern for the action is: K1000DeviceAction://Path/FileToExecute@FileType@Parameters; any other device action can be created using this pattern.

An action running a ping looks like this: K1000DeviceAction://cmd@exe@/c ping KACE_HOST_IP
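
For anyone curious how such a URI gets turned back into a command on the client, here is a rough, hypothetical Python sketch of the parsing step. It only illustrates the Path/FileToExecute@FileType@Parameters pattern above; the actual KACE script may work quite differently, and the helper name and sample IP are made up:

    import subprocess
    from urllib.parse import unquote

    def parse_device_action(uri):
        # Strip the pseudo-protocol prefix, then split Path/FileToExecute@FileType@Parameters
        body = unquote(uri)
        prefix = "K1000DeviceAction://"
        if body.startswith(prefix):
            body = body[len(prefix):]
        path, file_type, params = body.split("@", 2)
        executable = path + "." + file_type      # e.g. .../vncviewer + .exe
        return [executable] + params.split()

    # The ping example from above, with a literal IP in place of KACE_HOST_IP
    cmd = parse_device_action("K1000DeviceAction://cmd@exe@/c ping 192.168.1.10")
    subprocess.run(cmd)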


On the computers where the device action shall be executed, popups must be allowed for the K1000 website in the browser settings.

The solution has been tested with Windows 7, 8, 8.1 and 10 running MS Internet Explorer, Microsoft Edge, Google Chrome and Firefox.


Here are some screenshots showing what it looks like running the action from Microsoft Edge:

NOTE: This is a custom solution that is not officially supported by Dell Software Support!



Entity Framework Core Support in New Versions of Devart ADO.NET Providers

Devart has announced new versions of our ADO.NET providers - enhanced database connectivity solutions for databases and cloud applications, built over the ADO.NET architecture, that support the Entity Framework and LinqConnect ORM solutions. The new versions of the dotConnect data providers offer you support for Entity Framework Core RC1.

Entity Framework Core Support

Entity Framework Core (formerly Entity Framework 7) is a new version of Microsoft's widely used ORM, which is completely redesigned and intended to be a multi-platform and more lightweight ORM solution that can be used in traditional .NET scenarios, in the cloud, on devices, etc.

Currently, Entity Framework Core support in our providers is implemented only for the full .NET Framework platform (.NET Framework 4.5.1 and higher).

Since Entity Framework Core is a completely redesigned ORM, which shares little but a name and LINQ support with previous Entity Framework versions, there are a lot of changes in our Entity Framework providers for Entity Framework Core. You can read more about these changes in our blog articles: Entity Framework Core 1 (Entity Framework 7) Support and Migrating Entity Framework 6 projects to Entity Framework Core 1 (Entity Framework 7).

To learn more about the updated Devart ADO.NET Data Providers, please visit https://www.devart.com/dotconnect/

dotConnect for Oracle New Features

The new version of dotConnect for Oracle contains a number of improvements. Here are the main ones:

OracleLoader Improvements

Now OracleLoader has the Options property, which allows you to configure data loading behavior: keep constraints enabled while loading data, disable table logging, indexes, and triggers, use its own transaction for loading data, etc. New LoadTable method overloads allow you to load rows from a DataReader, an array of DataRow objects, or only rows with a specific RowState from a DataTable. OracleLoader can also now load data to a specific partition of a partitioned table. A new RowsCopied event allows tracking data loading progress and correctly aborting the operation when necessary.

Performance Counters Support

dotConnect for Oracle now supports performance counters, providing various information about active/inactive/pooled connections, connects per second, etc. You can view this information in Windows Performance Monitor or access it programmatically. Note that they count only connections with the Use Performance Monitor connection string parameter set to true, and there should be at least one application with such a connection running in order to see these counters in Windows Performance Monitor.

Transaction Guard Support

The new version of dotConnect for Oracle provides advanced support for Transaction Guard with the new OracleLogicalTransaction class and the OracleLogicalTransaction property of the OracleConnection class.

Source Edition

Now you can get source access to the OCI mode implementation of all the dotConnect for Oracle runtime classes by purchasing a license for the Professional Edition with Source Code. It has the same features as the Professional Edition and includes the source code for most of the runtime classes and features. Note that it does not include sources for design-time features, like Entity Developer or Package Wizard, and the source code for the Direct mode comes in obfuscated form.

Other Improvements

  • HA (High Availability) Events support
  • Improved OracleLob class
  • Improved OracleXML class
  • Improved OracleString class
  • Improved OracleRef class
  • Improved OracleCursor class
  • Improved other structures representing Oracle data types


To learn more about the updated dotConnect ADO.NET Data Provider for Oracle, please visit https://www.devart.com/dotconnect/oracle/
