The PunchPlatform is a big data platform with a strong emphasis on industrial deployment, end-to-end critical data processing, and data analytics applications. Although it is used in security platforms and monitoring systems, it has a wider scope of application because it provides a simple yet extremely robust data pipeline concept implemented on top of Apache Kafka. You can design industrial data pipelines and plug in your own processing wherever you need it, from simple data tagging to complex distributed machine learning algorithms. The platform and your pipelines are deployed in minutes, yet come with integrated monitoring. From the start you understand your performance, you are ready for capacity planning, and you can seamlessly handle the likely growth of your applications.
One of its key differences from a plain Elasticsearch-Logstash-Kibana (ELK) setup is that it lets you deploy arbitrary processing in Storm and Spark engines, not just Logstash filters. Because it has been used in various industrial applications worldwide, it now comes equipped with dozens of ready-to-use log parsers. These are deployed automatically in the stream of data.
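The pipeline idea above can be sketched in a few lines. The names below are purely illustrative and are not the PunchPlatform API; they only show the concept of chaining arbitrary processing stages, from simple tagging to heavier analytics, between a source (such as Kafka) and a sink.

```python
# Conceptual sketch of a pluggable pipeline, NOT PunchPlatform code.
# Every name here (run_pipeline, tag_with_source, ...) is hypothetical.

from typing import Callable, Dict, Iterable, Iterator, List

Record = Dict[str, object]
Stage = Callable[[Record], Record]

def run_pipeline(source: Iterable[Record], stages: List[Stage]) -> Iterator[Record]:
    """Push every record through each processing stage, in order."""
    for record in source:
        for stage in stages:
            record = stage(record)
        yield record

def tag_with_source(record: Record) -> Record:
    """A trivial 'data tagging' stage: annotate where the log came from."""
    record["source"] = "firewall"
    return record

logs = [{"message": "connection denied"}, {"message": "connection accepted"}]
tagged = list(run_pipeline(logs, [tag_with_source]))
```

Swapping a stage for a parser, an enrichment step, or a model-scoring function is just a matter of changing the list of stages, which is the essence of the pluggable-pipeline design.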
Writing a parser on the PunchPlatform is easy: a powerful scripting language is available. That said, the parsers provided by the platform are well designed, cover most well-known equipment and systems, and come as modular functions you can deploy in a number of ways. Most importantly, extra care has been taken over field and data normalisation, which in turn enables powerful search (using Kibana or the native Elasticsearch APIs) and lets you plug in machine learning processing by simple configuration.
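To give a flavour of what normalisation buys you, here is a minimal, hypothetical sketch in plain Python: device-specific log lines are mapped onto a small set of common field names so that searches and ML features stay uniform across equipment. The regex and the field names are assumptions chosen for illustration, not the PunchPlatform scripting language or its actual field taxonomy.

```python
# Hypothetical normalisation sketch: map a syslog-like line onto
# common field names. Field names below are illustrative only.

import re

SYSLOG_LINE = re.compile(
    r"(?P<ts>\w{3} +\d+ \d{2}:\d{2}:\d{2}) (?P<host>\S+) (?P<msg>.*)"
)

def normalise(raw: str) -> dict:
    match = SYSLOG_LINE.match(raw)
    if match is None:
        # Keep the raw line searchable even when parsing fails.
        return {"message": raw, "parse_failure": True}
    return {
        "obs.ts": match.group("ts"),          # normalised timestamp field
        "init.host.name": match.group("host"),  # normalised emitter field
        "message": match.group("msg"),
    }

event = normalise("Jan 12 08:15:42 fw-01 connection denied from 10.0.0.9")
```

Because every parser emits the same field names, a single Kibana query or a single feature-extraction step works across all log sources.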
In a nutshell: select your data fields, configure an arbitrary Spark pipeline, and start executing real machine learning processing on a production-ready platform.
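As a toy illustration of that workflow, the sketch below selects one numeric field from normalised events and flags statistical outliers. In a real deployment this logic would live inside a Spark pipeline; here it is plain Python with a hypothetical z-score stage, chosen only to show the shape of "pick a field, score the events".

```python
# Toy "select a field, score events" stage. This stands in for a real
# Spark ML pipeline; names and thresholds are illustrative assumptions.

from statistics import mean, stdev

def zscore_outliers(events, field, threshold=1.5):
    """Return the events whose chosen field deviates strongly from the mean."""
    values = [e[field] for e in events]
    mu, sigma = mean(values), stdev(values)
    return [e for e in events if abs(e[field] - mu) / sigma > threshold]

# Four ordinary transfer sizes and one anomalous spike.
events = [{"bytes": v} for v in [100, 110, 95, 105, 10_000]]
suspicious = zscore_outliers(events, "bytes")
```

The point is the separation of concerns: field selection and scoring are configuration-level choices, while the platform handles deployment and monitoring around them.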
The PunchPlatform provides many additional functions such as multi-tenancy, long-term archiving using CEPH object storage, and multi-site deployment. Feel free to visit the online documentation for an overview.