Announcing Stream Workers: Complex Event Processing And Streaming Analytics At The Edge


We’re excited to announce the general availability of Stream Workers, Macrometa’s complex event processing (CEP) and streaming analytics engine.

Complex event processing is a tricky topic, made simple by Macrometa’s Global Data Network. Macrometa recently held a webinar giving an overview of CEP and the distinction between CEP and streaming analytics. 

CEP allows you to work with one or more data streams, optionally combined with data-at-rest, to infer “complex events”. For example, you might have streaming financial trade data, and by joining that with other data about macroeconomic conditions, you could infer a “market correction” event.
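To make that concrete, here is a minimal sketch of such an inference, written in a Siddhi-style stream query language (a common CEP idiom that Stream Workers’ query syntax resembles). The stream names, fields, and thresholds are all hypothetical:

    -- Hypothetical inputs: raw trades plus a macroeconomic indicator feed.
    define stream TradeStream (symbol string, price double, volume long);
    define stream MacroStream (indicator string, value double);

    -- Flag symbols whose price fell more than 10% within a 10-minute window.
    from TradeStream#window.time(10 min)
    select symbol, min(price) as lowPrice, max(price) as highPrice,
           (max(price) - min(price)) / max(price) as dropRatio
    group by symbol
    having dropRatio > 0.10
    insert into PriceDropStream;

    -- Keep only negative readings of a (hypothetical) confidence indicator.
    from MacroStream[indicator == 'consumer_confidence' and value < 0.0]
    select indicator, value
    insert into NegativeMacroStream;

    -- A price drop coinciding with a negative indicator is inferred as a
    -- "market correction" complex event.
    from PriceDropStream#window.length(1) as p
      join NegativeMacroStream#window.length(1) as m
    select p.symbol, p.lowPrice, p.highPrice, m.value as indicatorValue
    insert into MarketCorrectionStream;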

Meet Macrometa Stream Workers

Stream Workers provide typical streaming analytics capabilities, including stream joins, aggregations, counts, and filtering, as well as sophisticated high-level primitives for complex event processing such as pattern recognition, conditional logic, and flow control. The Stream Workers engine runs within the Global Data Mesh, allowing you to combine operational data-at-rest with streams of data from Macrometa’s streaming collections and from other event streams, such as Apache Kafka topics. Stream Workers can write to data stores within the Global Data Mesh or stream results to other streaming frameworks like Kafka, and Stream Worker applications can be queried directly through a REST API. Because Stream Workers are tightly integrated with the Global Data Mesh, there is no need to transfer data from your operational data store to the CEP engine, and large amounts of reference data are instantly available for sophisticated predictive modeling at very low latency.
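For example, enriching a stream with reference data held at rest might look like the following sketch, in the same Siddhi-style syntax as above (the streams and table are hypothetical, and the exact way Stream Workers exposes a Global Data Mesh collection as a table may differ):

    -- Hypothetical: a stream of orders enriched against a data-at-rest collection.
    define stream OrderStream (customerId string, amount double);

    -- Stand-in for a Global Data Mesh collection bound as a queryable table.
    define table CustomerTable (customerId string, tier string);

    -- Join each incoming order with the matching customer record.
    from OrderStream as o join CustomerTable as c
      on o.customerId == c.customerId
    select o.customerId, o.amount, c.tier
    insert into EnrichedOrderStream;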

This may sound very complicated, and until now it has been. But with Macrometa Stream Workers, all developers need are basic SQL and JavaScript skills; the Stream Workers engine handles the rest. Your application can run simultaneously in over 175 locations worldwide and synchronize automatically at the speed of light. As a result, companies and developers using the Macrometa Global Data Network have a real competitive advantage: access to a powerful distributed stream processing engine without the developer and administration complexity you’d expect.

What can you build with Stream Workers?

With Stream Workers, you can build self-contained microservices as well as traditional streaming analytics applications: create REST APIs, stand up microservices, and serve the logic of an application backend. To demonstrate this, we’ll walk through our cryptocurrency trading demo below. Stream Workers can solve a wide range of problems, but here are some simple use cases:

Data Transformation

Transform data from one format to another, for example, from XML to JSON, and clean and filter your data (a sketch of the PII-filtering case follows this list):

  • Extract, transform, and load (ETL) data from log files and scrape HTTP endpoints.
  • Obfuscate sensitive information by filtering it from messages, logs, or other data streams that may include Personally Identifiable Information (PII) or Personal Health Information (PHI).
  • Correlate data by joining multiple streams to create an aggregate stream.
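Here is what the PII-filtering case might look like as a Siddhi-style sketch. The stream and fields are hypothetical, and str:replaceAll comes from the Siddhi string extension; its availability in Stream Workers is an assumption:

    -- Hypothetical log stream carrying a free-text message and an email address.
    define stream LogStream (service string, message string, email string);

    -- Mask anything shaped like a US Social Security number in the message
    -- and drop the email address before the event is forwarded.
    from LogStream
    select service,
           str:replaceAll(message, '[0-9]{3}-[0-9]{2}-[0-9]{4}', 'XXX-XX-XXXX') as message,
           'REDACTED' as email
    insert into CleanLogStream;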

Complex Event Processing

Derive insights by identifying event patterns in data streams (a sketch follows this list):

  • Summarize data with time windows and incremental aggregations.
  • Trigger actions based on streaming data, either as a single service request or a complex enterprise integration flow.
  • Build and refine predictive models for machine learning.
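A windowed aggregation and a simple event pattern might look like the following Siddhi-style sketch (the sensor stream, threshold, and window size are hypothetical):

    define stream SensorStream (deviceId string, temperature double);

    -- Incremental aggregation: a per-device average temperature every minute.
    from SensorStream#window.timeBatch(1 min)
    select deviceId, avg(temperature) as avgTemp
    group by deviceId
    insert into MinuteAvgStream;

    -- Pattern recognition: three consecutive over-threshold readings from the
    -- same device trigger an alert event.
    from every e1=SensorStream[temperature > 90.0]
         -> e2=SensorStream[deviceId == e1.deviceId and temperature > 90.0]
         -> e3=SensorStream[deviceId == e2.deviceId and temperature > 90.0]
    select e1.deviceId, e3.temperature as lastReading
    insert into OverheatAlertStream;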

Machine Learning Model Serving

Build, refine, and serve machine learning models:

  • Make real-time offers to customers visiting an eCommerce site based on their browsing or shopping history.
  • Detect threats in cybersecurity applications.
  • Anticipate failures in predictive maintenance use cases.

Stream Workers Demo: High-Frequency Crypto Trading App

In this demo application, we build a crypto arbitrage trading bot that trades Bitcoin locally on exchanges in the US, Germany, and Japan.

Exchange arbitrage is a trading strategy based on differences in the price of a cryptocurrency across exchanges. Opportunities open up when a price discrepancy between exchanges grows large enough to be significant, typically 1–3%. However, these arbitrage opportunities usually exist for only seconds, so an arbitrage trader must constantly search for the best ones and act on them immediately.

Implementing arbitrage strategies in the crypto market takes work and involves several technical complexities. Besides holding deposits on multiple major exchanges, you need to monitor arbitrage opportunities on all of them simultaneously, execute trades locally at each exchange with minimal delay, and give the globally distributed trading agents timely visibility into, and interaction with, one another. Until now, meeting that challenge required big dollars and a team with the technical expertise to build a distributed infrastructure.

To build the demo, we used the following:

  1. Macrometa Global Data Mesh collections, both streaming and data-at-rest, and Macrometa Edge Compute’s Stream Workers CEP engine.
  2. A document store to hold trade data and to share all trade-related metadata and state between the decentralized, stateless bots and components.
  3. Geo-distributed streams to publish real-time price data from each exchange. Each bot subscribes to the stream and decides whether to buy or sell (a sketch of this comparison follows the list).
  4. Real-time updates from the datastore to notify the clients/bots/subscribers of changes to the database so they can adjust their trading strategy.
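To make the comparison step concrete, here is a rough Siddhi-style sketch of spread detection between two exchanges. The stream names, fields, and 1% threshold are hypothetical; the actual implementation is in the GitHub repo linked below:

    -- Hypothetical per-exchange price feeds.
    define stream UsPriceStream (pair string, price double);
    define stream DePriceStream (pair string, price double);

    -- Compare the latest price on each exchange and compute the spread.
    from UsPriceStream#window.length(1) as us
      join DePriceStream#window.length(1) as de
      on us.pair == de.pair
    select us.pair, us.price as usPrice, de.price as dePrice,
           (de.price - us.price) / us.price as spread
    insert into SpreadStream;

    -- Emit an opportunity event when the spread exceeds 1% in either direction.
    from SpreadStream[spread > 0.01 or spread < -0.01]
    select pair, usPrice, dePrice, spread
    insert into ArbitrageOpportunityStream;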

The entire backend of this application is written in SQL and JavaScript in only 174 lines of code! Watch the video for a complete app walkthrough, or check out the code on GitHub.

The Stream Workers Developer Experience

Stream Workers logic is written in standard Postgres SQL and JavaScript. Our web console includes a full-featured REST API, code editor, and publisher. You can query any data store directly in your code without any configuration.

We provide native sources and sinks to connect to Kafka, Google Pub/Sub, MQTT, and S3.

[Image: The Stream Workers code editor]

After you’ve written your code and are ready to save the Stream Worker, click Save and pick the regions where you want to deploy it.

[Image: Create Stream Worker]

When you’re ready to go live, click Publish. You don’t need to manage any software images or configuration; we get your app up and running in a few seconds. Macrometa’s intelligent traffic routing means that every time a user or client sends data to the GDN, it’s routed to the region closest to them. That means your applications will be FAST.

We provide several sample Stream Workers in the web console so you can get started quickly.


Conclusion

We think Stream Workers is ushering in the next era of developing applications at the edge, but don’t just take our word for it. Instead, read through our documentation to see all the available APIs, and request a trial to try them out. We can’t wait to see what you build.

Further reading:

The Guide to Event Stream Processing

Complex Event Processing with Macrometa

Choosing a Complex Event Processing Engine

Implement Complex Event Processing in Minutes (Macrometa tutorial)

Using Macrometa Stream Workers In A Real-Time Log Analytics Dashboard
