Automating Data Integration with Epic Using PowerShell
Epic is arguably the dominant healthcare application in the United States. Other systems serve the same function, but Epic holds roughly a 51% share as US healthcare organizations' choice for clinical systems. It is impossible to deny: if you are in the healthcare IT space, you are interfacing with Epic.
One of the challenges Epic poses is its interdependency with third parties. Specifically, one of our clients had a problem with the way data moved between Epic's Revenue Cycle modules and the third parties that provided billing services to the organization. Before our engagement, there was no apparent standard for securely transferring data to these various vendors.
Once engaged, our focus was to provide a standard, secure means by which IT could control the transfer of data between internal and external entities, with the caveat that the SFTP protocol be used, as mandated by the IT Security team.
To fit a heavily Windows-based IT organization and meet the IT Security mandates, our primary focus became developing a solution that combines PowerShell, WinSCP, Windows Task Scheduler, and an existing SFTP server.
The solution encompasses the internal Epic AIX server, the SFTP server, the SFTP gateway, and the external vendor servers. Data needed to flow in both directions (to and from Epic) for many of the external organization connections.
The solution has been broken up into multiple scripts that act as steps (and easily manageable breakpoints) for the transfer of data. The scripts run in Task Scheduler from the SFTP server.
Schedules were aligned with the completion of the internal Epic cron jobs and with each external vendor's ability to consume the transferred data.
At the core, these scripts act as reusable templates which require minimal modification to facilitate new third parties as they are identified over time (as is often the case). The remainder of this post breaks down how the scripts work.
First, all of the scripts secure connection information (username and password) using native PowerShell encryption. Each script retrieves an encrypted password file and a corresponding AES key from a remote location. The error action preference is then set to Stop, which halts a script whenever an error is thrown.
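As a sketch of that credential step (the UNC paths and file names here are hypothetical placeholders for your own secured share), this is what it looks like in PowerShell:

```powershell
# Hypothetical paths; substitute your own secured network share.
$keyPath      = "\\fileserver\secure\vendor.aes.key"
$passwordPath = "\\fileserver\secure\vendor.pwd"

# The key file holds an AES key previously saved with Set-Content;
# the password file was created with ConvertFrom-SecureString -Key $key.
$key            = Get-Content -Path $keyPath
$securePassword = Get-Content -Path $passwordPath | ConvertTo-SecureString -Key $key

# Stop on any error so the catch block always fires.
$ErrorActionPreference = 'Stop'
```

Storing the key separately from the password file means a copied password file alone is useless, while any account granted access to both shares can run the script unattended.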
The remainder of each script is written in a try/catch/finally block to perform error handling. Try contains the block of code to be executed; catch tells the script what to do with any error; and finally tells the script to run its section of code no matter what happens elsewhere in the script.
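The overall skeleton of each script is therefore:

```powershell
try {
    # Main transfer logic goes here.
}
catch {
    # React to any error thrown above (e.g., notify the IT team).
}
finally {
    # Cleanup that must run regardless of success or failure.
}
```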
Within the first try block, the WinSCP .NET assembly is loaded to automate the setup of a WinSCP session for the SFTP connection that will be established between the SFTP server and the third party. The WinSCP session options are set, including the protocol to be used (SFTP by default), the host to connect to (which, depending on the direction of data, is either the third-party server or the Epic AIX server), the username and secure password to use, and the SSH fingerprint of the host.
An option for advanced session logging is commented out. If advanced logging is needed for troubleshooting purposes, the hash can be removed to produce a thorough session log the next time the script is run.
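A sketch of that setup, using the WinSCP .NET assembly's `SessionOptions` class (the install path, host name, account name, and fingerprint are placeholders):

```powershell
# Load the WinSCP .NET assembly (default install path; adjust to your environment).
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol              = [WinSCP.Protocol]::Sftp
    HostName              = "vendor.example.com"       # or the Epic AIX server, per direction
    UserName              = "svc_transfer"             # hypothetical service account
    SecurePassword        = $securePassword            # decrypted earlier in the script
    SshHostKeyFingerprint = "ssh-rsa 2048 xx:xx:xx..." # the host's actual fingerprint
}

$session = New-Object WinSCP.Session
# Remove the hash for a verbose session log when troubleshooting:
# $session.SessionLogPath = "D:\Logs\winscp-session.log"
```

Pinning `SshHostKeyFingerprint` ensures the script refuses to connect if the remote host's identity changes unexpectedly.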
In a nested try block, the WinSCP session is opened. Within the session, a file mask is set, which is used to retrieve files of a specified type. In the case of all our scripts, the files to be returned are those with a modification date less than or equal to the current date.
The files are then retrieved or sent in binary format over SFTP. This is done by providing the source folder, the destination folder, an option for whether or not to delete the source after transfer, and the transfer options to be used (including the file mask).
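Those two steps might look like the following sketch; the remote and local paths are placeholders, and the date-constraint mask syntax is an assumption based on WinSCP's time-constraint file masks:

```powershell
try {
    $session.Open($sessionOptions)

    # Binary transfers with a file mask limiting files to the current date or earlier.
    $transferOptions = New-Object WinSCP.TransferOptions
    $transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
    $transferOptions.FileMask     = "*<=" + (Get-Date -Format "yyyy-MM-dd")

    # Pull files from the vendor; $False leaves the source files in place.
    # (The outbound direction uses $session.PutFiles instead.)
    $result = $session.GetFiles("/outbound/*", "D:\Transfers\VendorA\", $False, $transferOptions)
    $result.Check()   # throws if any individual file transfer failed
}
finally {
    # Always close the connection, whatever happened above.
    $session.Dispose()
}
```

`$result.Check()` is what turns a partial failure into a thrown error, so a single failed file still routes the script into the catch block.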
The source files are then archived after confirming that the files exist in the destination, unless the files are being retrieved from the third party, in which case the management of those files is left to the third party's discretion. Logging of success or failure is also established here to show whether or not the files were successfully uploaded or downloaded.
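For the outbound direction, the archive-and-log step can be sketched like this (all paths and the log format are hypothetical):

```powershell
# Hypothetical paths; archive the source only after confirming arrival.
$source  = "D:\Transfers\VendorA\Out\claims.csv"
$dest    = "/inbound/claims.csv"
$archive = "D:\Transfers\VendorA\Archive\"
$log     = "D:\Logs\transfer.log"

if ($session.FileExists($dest)) {
    Move-Item -Path $source -Destination $archive
    Add-Content -Path $log -Value "$(Get-Date -Format s) SUCCESS uploaded $source"
}
else {
    Add-Content -Path $log -Value "$(Get-Date -Format s) FAILURE $source not found at destination"
}
```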
The final block is used to close the WinSCP session regardless of the outcome of the previous sections.
The catch block performs error handling. In the event of an error, an email is sent to the IT team with pertinent information about which script failed, when it failed, and what failed.
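A minimal sketch of that notification, assuming an internal SMTP relay (server and addresses are placeholders):

```powershell
try {
    # ... transfer logic ...
}
catch {
    $body = "Script: $($MyInvocation.MyCommand.Path)`n" +
            "Time:   $(Get-Date)`n" +
            "Error:  $($_.Exception.Message)"

    Send-MailMessage -SmtpServer "smtp.example.com" `
                     -From "sftp-jobs@example.com" `
                     -To "it-team@example.com" `
                     -Subject "SFTP transfer script failed" `
                     -Body $body
}
```

The automatic `$_` variable inside catch carries the error record, so the email can name the exact failure without extra bookkeeping.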
In summary, this post was meant to showcase, at a high level, how PowerShell can serve as the core of an automated data transfer solution. We applied this solution to Epic for our client, but it can be used with any file system structure where a secure means of data transfer is needed. It is also worth mentioning that this structure leaves the client positioned to easily move these scripted building blocks into an automation platform such as System Center Orchestrator, should they purchase an enterprise automation application in the future.
Stay in the know by subscribing to our blogs!
Posted by Brandon Stephenson
Brandon Stephenson is the Infrastructure and Operations Practice Lead working out of KiZAN’s Cincinnati office. His team provides solutions focused on Azure Infrastructure, On-Prem Infrastructure, Automation, Identity, Identity Management, Business Continuity, Disaster Recovery, and IT Operations. He acts as an Architect and Technical Lead for infrastructure solutions, applying industry best practices to advise customers and transform the way they leverage technology to deliver value to their own customers.