Confluent has just announced a new $250 million funding round, nearly doubling its total funding to $456 million and lifting its valuation to $4.5 billion. The new round was probably the world's worst-kept secret, given that a Bloomberg report last month signaled an impending raise.
While most later-stage venture rounds like Confluent's typically funnel more resources into go-to-market functions such as sales, marketing, and customer support, Confluent is staking the new round on product development, specifically "Project Metamorphosis," which is aimed at building out event streaming capabilities in Confluent Cloud.
The funding caps a year in which annual recurring revenue roughly doubled and revenue from Confluent's managed cloud service more than quadrupled. The company also struck formal partnerships with Google Cloud and AWS to host its managed service, and last fall it became available through the Azure Marketplace. With Confluent Cloud on each of the major cloud platforms, Confluent is also positioning it as a multi-cloud service that can act as a publish/subscribe bridge spanning the clouds.
And a few months back, Confluent released version 5.4 of its enterprise platform, which focused on four key areas. Among them was a new multi-region cluster feature that supports asynchronous replication across data centers for more reliable failover. Security was bolstered with role-based access control in place of the original access control lists, plus new higher-level structured audit logs that replace a more primitive design with a format readable by Elasticsearch and Splunk; both reflect Confluent's goal of elevating its Kafka environment beyond its origins as a low-level developer tool for building pub/sub data pipelines. Finally, schema validation performed at the topic level inside the Kafka broker, rather than solely through an external schema registry check, should harden schemas and result in fewer broken applications.
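As a concrete sketch of that last feature: in Confluent Platform, broker-side schema validation can be switched on per topic. The snippet below assumes Confluent Server's topic-level validation configs and an invented topic name, so treat it as illustrative rather than a verified recipe:

```shell
# Create a topic with broker-side schema validation for record values;
# the broker rejects writes whose schema ID is not registered for the topic.
kafka-topics --create --topic orders \
  --bootstrap-server localhost:9092 \
  --partitions 3 --replication-factor 3 \
  --config confluent.value.schema.validation=true
```

Because the check runs inside the broker, a misbehaving producer is rejected at write time instead of corrupting downstream consumers.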
Confluent is reaching beyond its core audience of Java and Python developers to engage the SQL professional base by expanding KSQL, its facility for running SQL queries against streaming data, with a preview of a new streaming database, ksqlDB. It expands the footprint from streaming data to data at rest, and in many ways mimics the role of in-memory databases. With persistent data comes the opportunity to build materialized views that can provide more focused queries and analytics on data that would otherwise be in motion. One might imagine getting the same capability by attaching an in-memory database, but Confluent promotes its alternative as a simpler architecture.
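To make the materialized-view idea concrete, here is a toy Python sketch, not Confluent's API: the function name and event shape are invented for illustration. It folds a stream of events into a continuously updated table of aggregates, which is conceptually what a ksqlDB persistent query does when it maintains a count per key:

```python
from collections import defaultdict

def materialize(events):
    """Fold a stream of (user_id, url) page-view events into a 'table'
    of per-user view counts, updated incrementally per event, the way
    a streaming aggregate maintains a materialized view."""
    view = defaultdict(int)
    for user_id, _url in events:
        view[user_id] += 1  # incremental update as each event arrives
    return dict(view)

stream = [("alice", "/home"), ("bob", "/home"), ("alice", "/pricing")]
print(materialize(stream))  # {'alice': 2, 'bob': 1}
```

The point of the persistent store is that this "table" survives and serves point lookups directly, rather than forcing every query to replay the whole stream.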
And this is where Project Metamorphosis comes in. Admittedly, the name is a play on words and a bit tongue-in-cheek. Project Metamorphosis is meant to transform Kafka streaming into a cloud-native service that acts as glue with on-premises data sources. Or, more specifically, to develop Kafka's capabilities for treating the origins of streams as events, which can then be analyzed, manipulated, and queried. Keep that thought in mind, as it will be the central theme of Confluent's technology roadmap for the rest of the year.
Project Metamorphosis is named after the Franz Kafka tale about a poor fellow who, after a rough night, turns into an insect, is shunned and then reluctantly tolerated by his family, and dies months later. You can always rely on the open source crowd to tap obscure references to make their projects memorable, though choosing this one has definitely made us scratch our heads a bit.
Throughout the year, Confluent will roll out a wave of monthly announcements for its serpentine Project Metamorphosis initiative, which its press release describes as "event streaming's next act." Given that Confluent has drawn a quarter of a billion dollars on this vision, we assume the plan is for results that will be much less sinister than the tale's.