ELK Stack for Threat Hunting?
Mitigating cyber threats is one of the most challenging tasks for organizations on their digital journey. Today's threat landscape evolves rapidly, and attackers continue to innovate.
The more technology companies adopt, the more attractive they become as targets.
Perimeter defense and reactive security strategies are no longer sufficient. It is just a matter of time until an organization's critical (IT) infrastructure is compromised.
Threat hunting is a proactive cyber defense process of iteratively searching through endpoints and networks to detect advanced threats that evade existing security solutions.
With the Internet-of-Things in mind and connectivity approaches such as connected vehicles, we have to broaden our perspective and include devices, machines, sensors and more.
Threat hunting is the opposite of traditional evidence-based security measures, which start investigating data only after an alert or warning has been raised. Traditional solutions include Intrusion Detection Systems, malware sandboxes and Security Information and Event Management (SIEM) systems. To our knowledge, none of these existing security solutions is IoT-ready.
Modern threat hunting is a machine-assisted process, where machine intelligence augments human experience & knowledge to identify potential risks and formulate analytics-driven hunting hypotheses.
Big Data is the Minimum
A constantly increasing volume of security events, combined with the need to investigate ever longer log histories to detect traces of Advanced Persistent Threats (APTs), turns today's threat hunting, like cyber defense in general, into a big data rally.
Fast Data is the Must-Have
Reducing infiltration time is critical to mitigating risks: attack-induced production downtime, stolen intellectual property, reputational damage and more. Finding unknown indicators of compromise or traces of advanced threat patterns is like finding a needle in a haystack.
It is about fast, recurring cycles of pattern detection to minimize infiltration time.
With this in mind, fast data technologies, where petabytes meet (near) real-time, are vital for proactive cyber defense approaches.
ELK Stack for Threat Hunting?
The Elastic Stack, i.e. Elasticsearch, Logstash, Kibana and the associated family of Beats, is a popular open source stack for all kinds of modern data analytics. It is no surprise that we observe approaches to leverage the popular ELK Stack for analytics-driven threat hunting.
Elastic as the heart of an open source threat hunting platform (remember, Elasticsearch is first and foremost a search engine) raises at least these key questions:
- Is Elastic data processing big enough to persist huge amounts of long-lived log histories, say, months?
- Is Elastic data processing fast enough (compared to in-memory data platforms) to minimize infiltration times?
- Is Elastic data processing flexible enough to join and aggregate search silos (indices), compared to fast big data SQL databases?
If you doubt that all of these questions can be answered with yes, there is an open source alternative. And it is an alternative that is not restricted to cyber defense, but applicable to other data-intensive application areas as well.
Hunting in Endpoint Logs
Let's take a look at an example to sketch open source alternatives to the popular ELK Stack: threat hunting in endpoint logs.
Osquery is an OS instrumentation framework for Windows, OS X (macOS), Linux, and FreeBSD. It exposes an OS as a high-performance relational database. Running processes, loaded kernel modules, open network connections, browser plugins, hardware events or file hashes are represented as SQL tables and can be explored by writing SQL queries.
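For illustration, a hunting query of this kind could join osquery's standard `processes` and `process_open_sockets` tables to list processes holding open remote connections (the filter values are illustrative assumptions, not a vetted detection rule):

```sql
-- List running processes with open remote network connections,
-- joining osquery's processes and process_open_sockets tables.
SELECT p.pid, p.name, p.path, s.remote_address, s.remote_port
FROM processes p
JOIN process_open_sockets s ON p.pid = s.pid
WHERE s.remote_address NOT IN ('', '0.0.0.0', '127.0.0.1', '::1')
ORDER BY p.name;
```

Such a query can be run interactively in `osqueryi` or scheduled via the osquery configuration.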
Osquery can be configured to publish endpoint logs directly to an Apache Kafka topic.
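A minimal sketch of an osquery configuration that forwards scheduled query results to Kafka via the `kafka_producer` logger plugin might look as follows; the broker address, topic name and schedule entry are placeholder assumptions:

```json
{
  "options": {
    "logger_plugin": "kafka_producer",
    "logger_kafka_brokers": "kafka-broker:9092",
    "logger_kafka_topic": "osquery-results",
    "logger_kafka_acks": "1"
  },
  "schedule": {
    "open_sockets": {
      "query": "SELECT pid, remote_address, remote_port FROM process_open_sockets;",
      "interval": 60
    }
  }
}
```

With this in place, every scheduled result set lands on the `osquery-results` topic as a JSON event.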
Kolide Fleet is the most widely used open source osquery fleet manager.
Fleet supports remote deployment of osquery, enables live queries, and manages osquery-controlled endpoints.
Having externalized endpoint log events as Kafka topics opens the door to more advanced analytics. Analytics does not have to be immediately equated with deep learning or machine learning.
ATT&CK-based SQL Queries
MITRE developed a knowledge base of SQL queries based on the Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) adversary model.
These queries can be applied either directly to Kafka's real-time event stream or after events have been persisted in an appropriate time series database. Let's start with SQL queries on Kafka event streams.
Why not leverage Confluent's KSQL? KSQL is a streaming SQL engine that enables real-time data processing against Apache Kafka. It provides an easy-to-use yet powerful interactive SQL interface for stream processing on Kafka, without the need to write a single line of code.
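As a sketch, a KSQL stream could be declared over the endpoint log topic and then queried continuously; the topic name, field names and port filter are illustrative assumptions, and recent ksqlDB versions additionally require `EMIT CHANGES` on push queries:

```sql
-- Declare a stream over the osquery result topic.
CREATE STREAM endpoint_events (
  host VARCHAR,
  name VARCHAR,
  pid INT,
  remote_address VARCHAR,
  remote_port INT
) WITH (KAFKA_TOPIC = 'osquery-results', VALUE_FORMAT = 'JSON');

-- Continuously flag connections to unexpected ports in real time.
SELECT host, name, remote_address, remote_port
FROM endpoint_events
WHERE remote_port NOT IN (80, 443, 53)
EMIT CHANGES;
```

The second statement runs as a standing query: every matching event is emitted the moment it arrives on the topic.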
This sounds great, and it is great, as long as data processing is limited to this single use case. Otherwise, you end up operating yet another platform, and another one after that: more technology, more interfaces, and growing effort to manage and operate such a multi-platform environment.
Full-Spectrum Machine Intelligence
Comprehensive data processing such as threat hunting is never restricted to a single operation. Finding the needle in the haystack requires combining many different data computing approaches that cover the full spectrum of machine intelligence.
This sample setup presents an end-to-end approach for combining an in-memory data processing platform with full-spectrum machine intelligence for threat hunting:
Apache Ignite is a high-performance, integrated and distributed in-memory platform for computing and transacting on large-scale data sets in real time, orders of magnitude faster than is possible with traditional disk- or flash-based technologies.
Apache Ignite comes with an ANSI SQL-99 compliant, horizontally scalable and fault-tolerant distributed SQL database. Distribution is provided either by partitioning the data across cluster nodes or by full replication.
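A minimal sketch of how an endpoint-event table could be declared in Ignite's distributed SQL; the table and column names are assumptions, and the `WITH` clause uses Ignite's template parameters to control partitioning:

```sql
-- Partition the table across cluster nodes, keeping one backup
-- copy of each partition for fault tolerance.
CREATE TABLE endpoint_events (
  event_id VARCHAR PRIMARY KEY,
  host VARCHAR,
  event_time TIMESTAMP,
  process_name VARCHAR,
  remote_address VARCHAR
) WITH "template=partitioned, backups=1";
```

Switching the template to `replicated` would instead keep a full copy of the table on every node, trading memory for read locality.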
PredictiveWorks is an open-source, code-free data integration and analytics platform based on the core technology of Google's Cloud Data Fusion service.
It offers 200+ plugins, organized as data connectors (Works Connect) and data analytics plugins (Works DL, Works ML and more), that can be orchestrated to respond to all kinds of data processing use cases with point-and-click.
CrateDB is a SQL database designed for IoT-scale amounts of data. In the sample approach above, CrateDB serves as an integrated persistence layer for Apache Ignite, mediated by PredictiveWorks.
CrateDB ships with a Postgres-compliant access layer and integrates with Grafana for time series visualization and Tableau for business intelligence purposes.
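A time-series query of the following shape could feed such a visualization; the table, columns and port filter are the same illustrative assumptions as above:

```sql
-- Aggregate suspicious-connection counts per host into hourly
-- buckets, e.g. as a data source for a Grafana panel.
SELECT date_trunc('hour', event_time) AS bucket,
       host,
       count(*) AS suspicious_connections
FROM endpoint_events
WHERE remote_port NOT IN (80, 443, 53)
GROUP BY date_trunc('hour', event_time), host
ORDER BY bucket DESC;
```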
Data Enrichment & Incident Management
Is something missing? Right. The sample end-to-end setup seems to ignore access to threat intelligence information to determine whether machine-generated threat signals look like known threats.
PredictiveWorks' current offering of connectors includes the ThreatConnect API, to incorporate known indicators of compromise, and TheHive, an open-source security incident response platform, to enable security analysts to carry out further analysis.
Even though these components are not shown in the sample setup, we hope this article is helpful for those searching for an alternative to the popular ELK Stack.
An alternative that is also appropriate for integrating further real-time events, say, telemetry events from devices and machines originating in the IoT domain: it is just a matter of leveraging another connector.
Originally published at https://www.linkedin.com.