
· 3 min read
Pete Goddard

Remember when Egon Spengler had a radical idea? We’ve got to cross the streams, he said. It was the only way to win. Well, in our world we’re wrangling data, not ghosts, but the same concept holds true. Especially when tackling complex data problems, crossing the streams is not only good but necessary.


· 4 min read
Pete Goddard

Deephaven Data Labs, maker of a high-performance time-series database, today announced the release of Deephaven Community Core, a free, real-time analytics data engine with relational database features. Now available to the open-source community, Deephaven Community Core empowers data developers and data scientists to apply dynamic, real-time data capabilities to the big data use cases that drive their analytics and applications.

· 13 min read
Nathaniel Bauernfeind

A devops dissection of Deephaven Community Core

Typically, when you start Deephaven, you would go through these steps, which use Docker and Docker Compose to set up a Deephaven server and its dependencies.
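Those linked quick-start steps aren't reproduced here, but for context, here is a minimal sketch of what the Docker Compose route typically looks like. The compose file below, the image name (`ghcr.io/deephaven/server`), and port `10000` are illustrative assumptions rather than the official configuration; follow the Deephaven quick-start guide for the real values.

```sh
# Minimal, illustrative Docker Compose setup for Deephaven Community Core.
# Image name, tag, and port are assumptions based on Deephaven's published
# container images; consult the official quick-start guide for current values.
cat > docker-compose.yml <<'EOF'
services:
  deephaven:
    image: ghcr.io/deephaven/server:latest   # assumed server image
    ports:
      - "10000:10000"                         # assumed default web/IDE port
EOF

docker compose pull      # fetch the image
docker compose up -d     # start the server in the background
# The web IDE should then be reachable at http://localhost:10000/ide
```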

However, running inside Docker is not the only way to set up Deephaven. Today, I'm going to walk through the steps necessary to get a Deephaven Community Core server running on a fresh install of Linux.

· 3 min read
Pete Goddard

It's natural to frame new information in the context of what you already know and understand. The same is true when determining where new vendors and technology/solution providers fit into established marketplaces. Deephaven enters an arena defined by Kafka, Spark, Influx, Redshift, BigQuery, Snowflake, Postgres, and dozens of other players.