Collect unlimited amounts of data generated by devices, machines and sensors. Apply stream processing for real-time alerting and automated analysis on enormous volumes of data. Gain unprecedented visibility for operational efficiency.
Intelligent edge software moves analytics away from centralized data centers. Analytics run on data locally to gain insights that can’t wait until the data is transferred to a central location. Most edge analytics software is embedded in connected devices and nearby gateways. These devices are optimized for low power and cost, and lack the capacity to retain and perform powerful analytics on enormous volumes of data.
Raise the bar in terms of the insights obtainable from intelligent edge software. Aggregate data from nearby gateways and devices to bring the power of big data analytics and real-time stream processing to an edge computing environment. Analyze massive amounts of data (hundreds of terabytes to petabytes), such as the data generated by IoT, entirely at the edge – without having to ship data to a central location.
Collect unlimited amounts of data from device gateways, including the programmable controllers and distributed control systems used within large industrial settings. Simple, flexible message-based data transfer ensures compatibility with all types of gateway devices, including those used in industrial environments. Data ingest scales to millions of inbound messages per second, ensuring collection of the data generated by very large systems of IoT and other connected devices.
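The message-based transfer described above can be sketched in a few lines. The envelope fields, batch size and `make_message`/`batch_messages` helpers below are illustrative assumptions for this sketch, not the product's actual wire format; the point is that any gateway able to emit a simple key/value payload can participate.

```python
import json
import time

def make_message(gateway_id, sensor_id, value, unit):
    """Wrap one sensor reading in a simple, self-describing JSON message.
    The envelope fields here are hypothetical -- any similar key/value
    payload would work for message-based ingest."""
    return json.dumps({
        "gateway": gateway_id,
        "sensor": sensor_id,
        "value": value,
        "unit": unit,
        "ts": time.time(),
    })

def batch_messages(messages, batch_size=500):
    """Group messages into fixed-size batches so a gateway can amortize
    network overhead when ingest rates climb toward millions of
    messages per second."""
    for i in range(0, len(messages), batch_size):
        yield messages[i:i + batch_size]

# Example: 1,200 readings from one controller become 3 outbound batches.
msgs = [make_message("plc-7", f"temp-{n}", 20.0 + n % 5, "C")
        for n in range(1200)]
batches = list(batch_messages(msgs))
```

Because each message is self-describing JSON rather than a device-specific binary format, the same ingest path works for programmable controllers, distributed control systems and ordinary IoT gateways alike.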
Automate the analysis of all incoming data with real-time stream processing. Examples include pre-aggregation, matching against blacklists, detecting anomalous conditions and identifying temporal patterns. Historical data remains local at the edge, so stream processing can combine stored data with incoming data to surface insights that couldn’t be found if data were retained for only a few hours within a gateway device. There are no new programming languages to learn: stream processing uses standard SQL syntax.
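The value of combining long-lived history with incoming data can be shown with plain SQL. This minimal sketch uses SQLite as a stand-in for edge-local storage; the table names, sensor IDs and the 50%-deviation threshold are assumptions chosen for illustration, not the product's actual schema.

```python
import sqlite3

# In-memory stand-in for edge-local storage.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE history  (sensor TEXT, value REAL);
    CREATE TABLE incoming (sensor TEXT, value REAL);
""")
# Months of retained baseline readings (abbreviated here).
conn.executemany("INSERT INTO history VALUES (?, ?)",
                 [("pump-1", v) for v in (10.0, 10.2, 9.8, 10.1)])
# Freshly arriving stream data.
conn.executemany("INSERT INTO incoming VALUES (?, ?)",
                 [("pump-1", 10.1), ("pump-1", 25.0)])

# Anomaly detection in ordinary SQL: flag incoming readings that deviate
# from the historical per-sensor average by more than 50%. This is only
# possible because history is retained locally instead of being aged out
# of a gateway after a few hours.
rows = conn.execute("""
    SELECT i.sensor, i.value
    FROM incoming i
    JOIN (SELECT sensor, AVG(value) AS avg_v
          FROM history GROUP BY sensor) h
      ON h.sensor = i.sensor
    WHERE ABS(i.value - h.avg_v) > 0.5 * h.avg_v
""").fetchall()
# rows -> [('pump-1', 25.0)]
```

The reading of 25.0 is flagged while 10.1 passes, because the query compares each new value against an average computed over the full retained history, something a short-lived gateway buffer cannot do.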
Operational technologists and data analysts require immediate access to machine and sensor data in order to make decisions that improve efficiency, safety and revenue. The people who need this data often do not work in the locations where it is collected. By aggregating and federating data at the edge, all data can be accessed and reported on immediately – from any location, country or continent. Insights and process improvements can be made at local, regional and global scale, while the data is analyzed from afar.
Cost-effectively retain petabytes of globally distributed data at the edge for instant analysis. Retain massive amounts of data for as long as needed. Mix newly generated and historical data to unlock new insights. All with the ease and familiarity of a SQL-based architecture that delivers fast performance for any query type, with standard interfaces to BI, DevOps and machine learning tools. There’s no need to rely on the cloud or other centralized big data architectures for powerful analytics when they can run entirely at the edge, close to where enormous volumes of data are generated.
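Mixing newly generated and historical data in one query can be sketched as follows. Again SQLite stands in for the edge store, and the single `readings` table with a `day` column is an assumed schema for illustration; the point is that old and new rows live side by side, so one standard SQL aggregate spans the whole retention window.

```python
import sqlite3

# In-memory stand-in for long-term, edge-local retention.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (day TEXT, sensor TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", [
    ("2023-01-01", "line-a", 100.0),  # historical
    ("2023-01-02", "line-a", 110.0),  # historical
    ("2023-06-01", "line-a", 140.0),  # newly generated
])

# One aggregate over the full retention window -- no shipping data to a
# central cluster, and any SQL-speaking BI or ML tool could issue the
# same query through a standard interface.
(avg_value,) = conn.execute(
    "SELECT AVG(value) FROM readings WHERE sensor = 'line-a'"
).fetchone()
```

Because the query is ordinary SQL against locally retained data, the same analysis runs identically whether the rows were written this morning or a year ago.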