What are possible ways to share data between different companies

Our company receives data from its customers and sends data back to them. The data arrives from customers as Excel files. Those files are non-standard and contain both real-time data and lab data, which may or may not be associated with PI tags. The files come in various sizes and arrive at irregular frequencies, and you can assume there are many more customers. What are the best possible ways to exchange data between the company and its customers? I know PI Cloud Connect is one option, and the PI Connector/Interface for UFL is another. The objective is to develop a solution that is less intrusive to customers and much more efficient after the files are received - reducing manual work by automatically updating data into the PI System from the Excel files (or some other way). I think that if a customer is willing to use the PI Cloud Connect service, that is probably the best solution. What about customers who don't want to engage with PI Cloud Connect and still prefer to send data via Excel, unless we have a much better way to engage and empower them? The PI Connector/Interface for UFL is cumbersome in this case: it would mean writing hundreds of INI files to handle the non-standard files. It's also very hard to ask customers to send standardized Excel files, since that would make them do more work than they currently do. What other solutions are there, so we can empower customers to onboard their tags and data into the company's PI System besides sending data via Excel files?

 

Appreciate your suggestions and comments.

  • Hi Kristy Wang:

     

    I agree PI Cloud Connect is far and away the best option. It sets up a direct data exchange between organizations, delivering data that is easily integrated into your PI Server and its corresponding tools.

     

    If this is not an option and you want the least intrusive way to share data, then a shared folder location that is at least password protected is a safe and simple option. I believe the key is requiring strict adherence to a data model. Data-sharing partners can upload CSV files in the prescribed format, and these can be easily parsed and their values written to PI tags using numerous tools - PowerShell is probably the one I would go with, but the UFL Interface would work great too. A variation of this option is a cloud-based spreadsheet they use to load their data, which you can access for downloading and/or parsing.
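    A minimal sketch of that parsing step, assuming a prescribed CSV layout of `tag,timestamp,value` (the column names and file layout here are assumptions; the actual write to PI would go through UFL, PI Web API, or a PowerShell/AF SDK script):

    ```python
    import csv
    import io
    from datetime import datetime

    def parse_prescribed_csv(text):
        """Parse a CSV in the agreed tag,timestamp,value layout into records.

        Rows that do not conform are collected separately so they can be
        reported back to the data-sharing partner rather than silently dropped.
        """
        good, bad = [], []
        for row in csv.DictReader(io.StringIO(text)):
            try:
                good.append({
                    "tag": row["tag"].strip(),
                    "timestamp": datetime.fromisoformat(row["timestamp"]),
                    "value": float(row["value"]),
                })
            except (KeyError, ValueError):
                bad.append(row)
        return good, bad

    sample = "tag,timestamp,value\nFLOW.01,2023-05-01T00:00:00,12.5\nFLOW.01,not-a-time,9.9\n"
    records, rejects = parse_prescribed_csv(sample)
    ```

    The strict-rejects list is the part that makes "adherence to a data model" enforceable: partners get immediate feedback on malformed rows.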

     

    Understanding that you indicated requiring strict adherence to a data model might "kill the deal", I might ask: what is the value to them? If the value is high, then requesting adherence to the data model might not seem that intrusive, especially if you deliver the data-entry tools for them. If this is not an option, I would consider some type of web form that loads their data into a SQL Server, where the data can be fed to the PI Server using the PI Interface for RDBMS. This method controls the fields that receive their data.
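    A rough illustration of that web-form-to-SQL pattern, using an in-memory SQLite database as a stand-in for SQL Server (the table and column names are made up; the real PI Interface for RDBMS would poll a staging table with a query of this shape):

    ```python
    import sqlite3

    # In-memory stand-in for the SQL Server staging table the web form writes to.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE staging (
            tag TEXT, ts TEXT, value REAL, processed INTEGER DEFAULT 0
        )
    """)

    # The web form inserts a row like this on each submit.
    conn.execute("INSERT INTO staging (tag, ts, value) VALUES (?, ?, ?)",
                 ("TEMP.07", "2023-05-01 00:00:00", 72.4))
    conn.commit()

    # The interface side periodically selects unprocessed rows,
    # writes them to PI tags, then marks them processed.
    rows = conn.execute(
        "SELECT tag, ts, value FROM staging WHERE processed = 0 ORDER BY ts"
    ).fetchall()
    conn.execute("UPDATE staging SET processed = 1 WHERE processed = 0")
    conn.commit()
    ```

    The `processed` flag is one simple way to make the polling idempotent, so the same submission is never written to PI twice.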

  • Thanks James for your quick response. I think a web form to upload data into SQL Server could be one possible solution. At least it's a starting point to get leadership thinking about the pros and cons. Again, thank you very much for your ideas!

  • Hi Tim,

     

    A VPN and PI-to-PI is one possible solution, assuming some customers are willing to adopt it. It is a much more direct and clean way to exchange data between different PI Systems.

     

    Thank you very much for your response. Awesome! I really enjoy the PI Square community's responses.

  • There are two different scenarios to be addressed: both companies use PI, or only one of the companies uses PI. I'm not going to address the case where neither uses PI, since that is outside the scope of this site.

    So, if both companies use PI, then all things being equal, PI Cloud Connect would be a wonderful solution. However, if there are reasons that PI Cloud Connect cannot be used (price, overhead, regulatory, etc.), then other solutions need to be explored.

     

    To that extent, I'll detail three different solutions I've used in the past.

     

    1. Data is received via an e-mail message. I built an application that opened the account inbox daily, looked for the expected e-mail based on the sender, extracted the data (it was plain text, but could just as easily have been an attachment), parsed it, and imported it into the PI System. In this case, the e-mail format was highly unlikely to change; the vendor used it as a 'standard' interface to customers, and changing the format would impact a number of customers.
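    A sketch of that kind of inbox processor, using Python's standard-library e-mail parsing (the sender address and the line format of the body are assumptions; a real version would fetch messages over IMAP and then hand the records to the PI write step):

    ```python
    from email import message_from_string

    # Stand-in for a message fetched from the inbox.
    RAW = """\
    From: data@vendor.example.com
    Subject: Daily readings

    FLOW.01 2023-05-01T00:00:00 12.5
    FLOW.02 2023-05-01T00:00:00 7.25
    """

    def extract_records(raw_message, expected_sender="data@vendor.example.com"):
        """Parse a plain-text vendor e-mail into (tag, timestamp, value) tuples."""
        msg = message_from_string(raw_message)
        if expected_sender not in msg.get("From", ""):
            return []  # ignore mail from unexpected senders
        records = []
        for line in msg.get_payload().splitlines():
            parts = line.split()
            if len(parts) == 3:
                tag, ts, value = parts
                records.append((tag, ts, float(value)))
        return records

    records = extract_records(RAW)
    ```

    Filtering on the sender before parsing mirrors the "look for the expected e-mail based on the sender" step described above.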

    At the same time, there were external contacts that needed information sent to them. A standard application was created to pull a list of tags by e-mail address and then send the data to those entities via plain-text e-mail.

     

    2. Data was exported to another company using FTP - not fun, but it was what they requested. Again, an application was written to read a configuration file to determine which tags were required and what type of data: compressed archive values, interpolated, aggregate, etc., as well as the field format (00.000, 0.0, etc.). The application created text files in an outgoing directory. A service was written to FTP the files and then read the receiver's directory to ensure the files had in fact been sent; if they had, they were moved to an archive directory. If they could not be confirmed as sent, they were left in the outgoing directory for future processing.
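    The field-format part of that configuration can be sketched like this, interpreting a pattern such as `00.000` as a zero-padded minimum width plus decimal places (the pattern syntax is an assumption modeled on the examples above, not the original application's actual rules):

    ```python
    def apply_field_format(value, pattern):
        """Format a number per a pattern like '00.000': the digits before
        the dot give the minimum integer width (zero-padded), the digits
        after the dot give the number of decimal places."""
        int_part, _, frac_part = pattern.partition(".")
        decimals = len(frac_part)
        # Total width includes the decimal point and fractional digits.
        width = len(int_part) + (decimals + 1 if decimals else 0)
        return f"{value:0{width}.{decimals}f}"
    ```

    For example, `apply_field_format(3.14159, "00.000")` yields `"03.142"`, matching the fixed-width text-file layout the receiver expects.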

     

    3. VPN and PI-to-PI. A VPN connection was created with a partner and a PI-to-PI interface established to pull the data from their PI system to the receiving PI system.

  • Hi,

    Most of our customers use PI-to-PI solutions. I'd say this is the old way of doing things. If you want a standard solution that suits most cases, I'd recommend going for an API approach. Some customers develop their own API platform, while others just use PI Web API.

     

    If you go for an API approach, you end up with a platform-independent solution.
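    With PI Web API, a write boils down to an HTTP POST of a JSON payload against a stream's WebId. A sketch of building that request (the server URL and WebId here are placeholders, and the actual send is left as a comment since it needs a live server and credentials):

    ```python
    import json

    def build_recorded_payload(base_url, web_id, samples):
        """Build the URL and JSON body for a PI Web API recorded-values POST.

        samples: iterable of (iso_timestamp, value) pairs.
        """
        url = f"{base_url}/streams/{web_id}/recorded"
        body = [{"Timestamp": ts, "Value": v} for ts, v in samples]
        return url, json.dumps(body)

    url, body = build_recorded_payload(
        "https://pi.example.com/piwebapi",   # placeholder server
        "F1DPExampleWebId",                  # placeholder WebId
        [("2023-05-01T00:00:00Z", 12.5)],
    )
    # The actual send would be something like:
    # requests.post(url, data=body,
    #               headers={"Content-Type": "application/json"},
    #               auth=(user, password))
    ```

    Because it is plain HTTPS and JSON, customers can push data from any platform without installing PI components, which is what makes the API approach platform-independent.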

  • To add to André's post, we also have customers sharing data using a third-party data-sharing solution. License2Share | CGI.com is a popular one in the Oil & Gas industry.

     

    For this we have set up PowerShell scripts which log into this service automatically throughout the day and download any new files available. Another set of scripts does the data processing on the files once they're available inside a certain folder, before the UFL interface finally writes the values to the PI Data Archive.
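    The "process files once they land in a folder" step can be sketched as a simple sweep (shown in Python rather than PowerShell; the folder names and the `process` transform are assumptions, and in the real setup UFL then picks up the processed files from the outgoing folder):

    ```python
    import shutil
    import tempfile
    from pathlib import Path

    def sweep(incoming, outgoing, archive, process):
        """Process each new file in `incoming`, write the result to
        `outgoing` for the UFL interface to pick up, and move the
        original into `archive` so it is only handled once."""
        incoming, outgoing, archive = Path(incoming), Path(outgoing), Path(archive)
        outgoing.mkdir(parents=True, exist_ok=True)
        archive.mkdir(parents=True, exist_ok=True)
        handled = []
        for f in sorted(incoming.glob("*.csv")):
            (outgoing / f.name).write_text(process(f.read_text()))
            shutil.move(str(f), str(archive / f.name))
            handled.append(f.name)
        return handled

    # Demo with temporary directories (in production these would be fixed shares).
    root = Path(tempfile.mkdtemp())
    (root / "in").mkdir()
    (root / "in" / "readings.csv").write_text("flow,1,2\n")
    handled = sweep(root / "in", root / "out", root / "done", str.upper)
    ```

    Archiving the originals keeps the sweep idempotent across repeated scheduled runs, the same way the FTP solution above parked confirmed files in an archive directory.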

  • In your last solution, you could also use the PI System Connector instead of the PI To PI Interface, couldn't you? The PI System Connector might be better in cases where both organizations use the PI Asset Framework.