Have you ever wondered how instant coffee is made? In this blog, I’ll walk you through my Coffee Factory project, which I created using Apama EPL with no previous knowledge of the language. The project consists of two sub-projects: one simulates the data within a coffee factory, and the other takes the role of Apama, analyzing the live data for any issues. If something goes wrong in our coffee factory, Apama detects the issue immediately and fixes it.… Read More
In Apama 10.3.1, we added support for HTTP redirects, cookie handling for requests, HTTP request decoding, and HTML form encoding to the HTTP client transport. We also added support for HTTP requests and HTML form decoding to the HTTP server transport.

HTTP redirect support in the HTTP client transport
The HTTP client transport now supports HTTP redirects transparently. To enable this, set the followRedirects configuration option to true. If the URL you requested returns an HTTP redirect response, the request is automatically re-issued to the URL to which you were redirected.… Read More
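As a rough illustration, followRedirects is set on the HTTP client transport entry in the connectivity chain configuration. In this sketch, only followRedirects is the option described above; the chain, host, and codec names are placeholders for your own setup:

```yaml
# Illustrative chain configuration (names other than followRedirects
# are placeholders for your own setup).
startChains:
  httpClientChain:
    - apama.eventMap          # host-side plug-in delivering events to EPL
    - jsonCodec               # example payload codec
    - HTTPClientTransport:
        host: example.com
        port: 443
        followRedirects: true # follow HTTP 3xx responses transparently
```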
In Apama 10.3.1, we introduced the “Batch Accumulator” and the “Message List” codecs. Both of these codecs handle batches of multiple events at the same time, to support use cases with high event rates.

The “Batch Accumulator” codec
You can easily add this codec to your connectivity plug-in chain, without changing anything else in your application or anything external, and it will allow events being sent from the connectivity transport to a correlator (host) to be batched.… Read More
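A rough sketch of how the codec might slot into an existing chain. Everything here other than the batch accumulator’s position, between the host plug-in and the rest of the chain, is a placeholder:

```yaml
# Illustrative chain: the batch accumulator batches events flowing
# from the transport towards the correlator (host).
startChains:
  myChain:
    - apama.eventMap      # host side of the chain
    - batchAccumulator    # accumulates events into batches for the host
    - jsonCodec           # example payload codec (placeholder)
    - myTransport         # your existing transport (placeholder)
```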
The purpose of this blog post is to introduce new features that extend the HTTP Server Connectivity Plug-in with the ability to respond to a request from EPL. Prior to 10.3.1, only a basic response was returned, indicating that the call had succeeded. This change allows a full request and response protocol to be implemented.
Now, you can configure the HTTP Server transport to allow the response to an HTTP request to be created by an EPL application, instead of automatically returning an empty accepted response.… Read More
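A minimal EPL sketch of the request/response pattern. HTTPRequest and HTTPResponse are hypothetical event definitions standing in for whatever your chain’s mapping rules actually produce, and the channel names are placeholders:

```
// Hypothetical events; real field names depend on your mapperCodec rules.
event HTTPRequest {
    integer requestId;  // correlates the response to its request
    string path;
    string body;
}

event HTTPResponse {
    integer requestId;
    integer statusCode;
    string body;
}

monitor HTTPResponder {
    action onload() {
        monitor.subscribe("httpRequests");
        on all HTTPRequest() as req {
            // Build and send a response correlated by requestId
            send HTTPResponse(req.requestId, 200, "Handled " + req.path)
                to "httpResponses";
        }
    }
}
```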
My goal with this blog post is to show how to set up Apama with Kafka and how they can be used together in a stream processing application. Kafka handles the transport and allows you to set up the delivery system, while Apama is a high performance event-processing engine with enough flexibility and throughput to provide for most applications. This blog post isn’t going to be a tutorial on Kafka, and I have used a containerized version of a Kafka cluster for simplicity.… Read More
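To give a flavour of the Apama side, here is a hedged EPL sketch. It assumes the Kafka connectivity bundle is configured so that correlator channels prefixed “kafka:” map to Kafka topics; the event definition, threshold, and topic names are illustrative:

```
event SensorReading {
    string sensorId;
    float value;
}

monitor KafkaBridge {
    action onload() {
        // Subscribe to events arriving from the Kafka "readings" topic
        monitor.subscribe("kafka:readings");
        on all SensorReading() as r {
            if (r.value > 100.0) {
                // Forward anomalous readings to a Kafka "alerts" topic
                send r to "kafka:alerts";
            }
        }
    }
}
```

Kafka thus carries events in and out of the correlator, while the EPL monitor holds the stream-processing logic.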
Apama uses the concept of “bundles” to provide the ability to add connectivity such as Kafka or MQTT to the project, and also for adding EPL capabilities such as date/time formatting. Until now, the only way to create a project and add these bundles was using the Software AG Designer graphical environment, which is only supported on Windows.
However, there is now a new tool called apama_project that enables all of this functionality from the command line, on Linux as well as Windows.… Read More
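An illustrative session with the tool, run from an Apama command-line environment. The project and bundle names here are placeholders; listing the bundles shows the exact names available in your installation:

```
apama_project create my_project       # create a new project skeleton
cd my_project
apama_project list bundles            # list the bundles you can add
apama_project add bundle "Kafka"      # add a connectivity bundle by name
```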