How to use Deephaven in a local development environment
Set up your IDE
Note
Example projects that depend on Deephaven are available for download for the Gradle and Maven build tools.
This example will use IntelliJ IDEA together with the Gradle build tool and show you how to create a project that uses Deephaven libraries.
Your new project will include a build.gradle file that looks something like this:
plugins {
    id 'groovy'
    id 'java-library'
}

group 'org.example'
version '1.0-SNAPSHOT'

repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.codehaus.groovy:groovy-all:3.0.9'
    testImplementation platform('org.junit:junit-bom:5.10.0')
    testImplementation 'org.junit.jupiter:junit-jupiter'
}

test {
    useJUnitPlatform()
}
To add Deephaven libraries to your project, add the following to your dependencies:
def dhcVersion = '0.37.4'
def dheVersion = '1.20240517.344'
implementation "io.deephaven:deephaven-engine-table:$dhcVersion"
implementation "io.deephaven:deephaven-extensions-csv:$dhcVersion"
implementation "io.deephaven:deephaven-extensions-parquet-table:$dhcVersion"
implementation "io.deephaven:deephaven-engine-test-utils:$dhcVersion"
implementation "io.deephaven.dnd:Database:$dhcVersion-$dheVersion"
testImplementation 'org.mockito:mockito-core:4.5.1'
This adds fundamental Core+ classes like Table and Database to your project, as well as tools to read CSV and Parquet files. Mockito will be used to mock the Database in unit tests.
Next, add the following Maven repository with your credentials. artifactoryUser and artifactoryAPIKey must be set in gradle.properties:
maven {
    credentials {
        username = artifactoryUser ?: System.getProperty('user.name')
        password = artifactoryAPIKey ?: ""
    }
    url 'https://illumon.jfrog.io/illumon/libs-customer'
}
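If these properties are not already defined, a minimal gradle.properties entry looks like the following. The values shown are placeholders; substitute your own Artifactory username and API key:

# gradle.properties (in the project root or in ~/.gradle/)
# Placeholder values -- replace with your Artifactory credentials.
artifactoryUser=your-artifactory-username
artifactoryAPIKey=your-artifactory-api-key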
The Core+ Database module also requires the Confluent Maven repository:
maven {
    url 'https://packages.confluent.io/maven'
    content {
        includeGroup 'io.confluent'
        includeGroup 'org.apache.kafka'
    }
}
Deephaven logs internal JVM stats, which requires the following JVM argument to allow access: --add-exports=java.management/sun.management=ALL-UNNAMED. Add this to your test task.
Your build.gradle file will now look something like this:
plugins {
    id 'groovy'
    id 'java'
}

group = 'org.example'
version = '1.0-SNAPSHOT'

repositories {
    mavenCentral()

    maven {
        credentials {
            username = artifactoryUser ?: System.getProperty('user.name')
            password = artifactoryAPIKey ?: ""
        }
        url 'https://illumon.jfrog.io/illumon/libs-customer'
    }

    maven {
        url 'https://packages.confluent.io/maven'
        content {
            includeGroup 'io.confluent'
            includeGroup 'org.apache.kafka'
        }
    }
}

dependencies {
    def dhcVersion = '0.37.4'
    def dheVersion = '1.20240517.344'

    implementation "io.deephaven:deephaven-engine-table:$dhcVersion"
    implementation "io.deephaven:deephaven-extensions-csv:$dhcVersion"
    implementation "io.deephaven:deephaven-extensions-parquet-table:$dhcVersion"
    implementation "io.deephaven:deephaven-engine-test-utils:$dhcVersion"
    implementation "io.deephaven.dnd:Database:$dhcVersion-$dheVersion"

    implementation 'org.codehaus.groovy:groovy-all:3.0.9'

    testImplementation platform('org.junit:junit-bom:5.10.0')
    testImplementation 'org.junit.jupiter:junit-jupiter'
    testImplementation 'org.mockito:mockito-core:4.5.1'
}

test {
    jvmArgs '--add-exports=java.management/sun.management=ALL-UNNAMED'
    useJUnitPlatform()
}
Refresh the project, and Gradle will download the specified modules.
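As a quick sanity check that the Core+ artifacts resolved, you can compile a trivial class that touches the Table and Database types. This is only a sketch; the class and method names are arbitrary and it is not part of the example project:

import io.deephaven.engine.table.Table;
import io.deephaven.enterprise.database.Database;

// Arbitrary smoke-test class: if this compiles, the Deephaven dependencies resolved.
public class DependencyCheck {
    static Table firstTenRows(final Database db, final String namespace, final String tableName) {
        // historicalTable and head are the same Core+ calls used later in this guide.
        return db.historicalTable(namespace, tableName).head(10);
    }
}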
Local unit testing
It may be helpful to have some local test data you can use to test your query's correctness.
The following simple query calculates the average and mid prices of stocks for a given day:
import io.deephaven.enterprise.database.Database
db = (Database) db
date = today()
AvgStockPrices = QueryUtils.getAvgStockPriceTable(db, date)
MidStockPrices = QueryUtils.getMidStockPriceTable(db, date)
It uses helper methods written in Java so that they can be unit tested:
import io.deephaven.engine.table.Table;
import io.deephaven.enterprise.database.Database;

public class QueryUtils {
    public static Table getAvgStockPriceTable(final Database db, final String date) {
        return db.historicalTable("LearnDeephaven", "StockTrades")
                .where("Date = `" + date + "`")
                .view("USym", "Last")
                .avgBy("USym");
    }

    public static Table getMidStockPriceTable(final Database db, final String date) {
        return db.historicalTable("LearnDeephaven", "StockQuotes")
                .where("Date = `" + date + "`")
                .lastBy("USym")
                .updateView("Mid = (Bid + Ask) / 2");
    }
}
We need some test data. Ten rows from the LearnDeephaven.StockTrades dataset have been saved as src/test/resources/StockTrades.csv, and ten rows from the LearnDeephaven.StockQuotes dataset have been saved as src/test/resources/StockQuotes.parquet. The contents of StockTrades.csv:
Date,Timestamp,SecurityType,Exchange,USym,Sym,Last,Size,Source,ExchangeId,ExchangeTimestamp,SaleCondition
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Arca,AAPL,AAPL,159.37,15,Normal,1045,2017-08-25T13:19:17.419592579-04:00,@FTI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Arca,AAPL,AAPL,159.48,85,Normal,1046,2017-08-25T13:19:17.419592579-04:00,@FTI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Arca,AAPL,AAPL,159.48,75,Normal,1047,2017-08-25T13:19:17.419592579-04:00,@TI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Arca,AAPL,AAPL,159.48,25,Normal,1048,2017-08-25T13:19:17.419592579-04:00,@FTI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Nasdaq,AAPL,AAPL,159.55,70,Normal,1055,2017-08-25T13:19:17.419592579-04:00,@TI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Arca,GOOG,GOOG,922.6,1,Normal,1406,2017-08-25T13:19:17.419592579-04:00,@TI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Arca,GOOG,GOOG,924.7,1,Normal,1410,2017-08-25T13:19:17.419592579-04:00,@TI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Nasdaq,GOOG,GOOG,924.7,1,Normal,1418,2017-08-25T13:19:17.419592579-04:00,@TI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Nasdaq,GOOG,GOOG,922.9,11,Normal,1420,2017-08-25T13:19:17.419592579-04:00,@FTI
2017-08-25,2017-08-25T13:19:17.419592579-04:00,Stock,Nasdaq,GOOG,GOOG,922.9,8,Normal,1421,2017-08-25T13:19:17.419592579-04:00,@FTI
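The Parquet file is binary, so its rows are not reproduced here. If you ever need to regenerate a small fixture like it, one approach is to build an in-memory table and write it with ParquetTools. The sketch below is illustrative only: the values are made up, it includes only the columns the query touches (USym, Bid, Ask), and it assumes it runs inside an open ExecutionContext such as the one created in the test setup described next.

import io.deephaven.engine.table.Table;
import io.deephaven.engine.util.TableTools;
import io.deephaven.parquet.table.ParquetTools;

public class FixtureWriter {
    // Hypothetical one-off helper: writes a tiny quotes table to the test resources folder.
    static void writeStockQuotesFixture() {
        final Table quotes = TableTools.newTable(
                TableTools.stringCol("USym", "PFE", "AAPL"),
                TableTools.doubleCol("Bid", 33.28, 159.45),
                TableTools.doubleCol("Ask", 33.29, 159.50));
        ParquetTools.writeTable(quotes, "src/test/resources/StockQuotes.parquet");
    }
}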
- The first step is to open an ExecutionContext so that we can use table operations like where and updateView. It is best practice to close the ExecutionContext after you are done with it. The @BeforeAll and @AfterAll annotations are used so that we only need to do this once for all tests.
@BeforeAll
public static void setup() {
    /*
     * Initialize the execution context for unit tests.
     * Table operations must be run within an open execution context.
     */
    final ExecutionContext executionContext = TestExecutionContext.createForUnitTests();
    final ControlledUpdateGraph updateGraph = (ControlledUpdateGraph) executionContext.getUpdateGraph();
    updateGraph.enableUnitTestMode();
    updateGraph.resetForUnitTests(false);
    executionContextCloseable = executionContext.open();
    db = getTestDatabase();
}

@AfterAll
public static void cleanUp() {
    /*
     * Clean up the execution resources.
     */
    executionContextCloseable.close();
}
- Mocking the Database to read test data in the form of CSV or Parquet files is easier than creating a real Database instance. This example uses Mockito:
/**
 * Mocking the database allows us to test our queries' correctness
 * without the complication of creating a real database instance.
 */
private static Database getTestDatabase() {
    final Database db = mock(Database.class);
    addData(db);
    return db;
}

/**
 * The queries we are testing use historical table data. This method sets up the mock database to return
 * the test CSV and Parquet data when db.historicalTable is called.
 */
private static void addData(final Database db) {
    try {
        final Table t = CsvTools.readCsv("src/test/resources/StockTrades.csv");
        when(db.historicalTable("LearnDeephaven", "StockTrades")).thenReturn(t);
    } catch (CsvReaderException e) {
        throw new RuntimeException("Could not read StockTrades.csv test data", e);
    }

    final Table t = ParquetTools.readTable("src/test/resources/StockQuotes.parquet");
    when(db.historicalTable("LearnDeephaven", "StockQuotes")).thenReturn(t);
}
- Use the methods described in the Core Extract table values guide to test your queries:
@Test
public void testAvgPriceQuery() {
    final Table testResult = QueryUtils.getAvgStockPriceTable(db, TEST_DATE);
    Assertions.assertFalse(testResult.isEmpty());

    final double sumAAPL = getFirstDoubleFromColumn(testResult.where("USym = `AAPL`"), "Last");
    Assertions.assertEquals(159.472, sumAAPL, 0.0002);

    final double sumGOOG = getFirstDoubleFromColumn(testResult.where("USym = `GOOG`"), "Last");
    Assertions.assertEquals(923.56, sumGOOG, 0.0002);
}

@Test
public void testMidPriceQuery() {
    final Table testResult = QueryUtils.getMidStockPriceTable(db, TEST_DATE);
    Assertions.assertFalse(testResult.isEmpty());

    final double midPFE = getFirstDoubleFromColumn(testResult.where("USym = `PFE`"), "Mid");
    Assertions.assertEquals(33.285, midPFE, 0.0);
}

private static double getFirstDoubleFromColumn(final Table table, final String column) {
    // To extract data from a specific cell in a Table,
    // we first need the row key for that row position.
    // Row 0 does not necessarily have row key 0.
    final long rowKey = table.getRowSet().get(0);
    // Use the ColumnSource to get the value of the column for that row key.
    return table.getColumnSource(column).getDouble(rowKey);
}
The full test class:
package org.example;

import io.deephaven.csv.CsvTools;
import io.deephaven.csv.util.CsvReaderException;
import io.deephaven.engine.context.ExecutionContext;
import io.deephaven.engine.context.TestExecutionContext;
import io.deephaven.engine.table.Table;
import io.deephaven.engine.testutil.ControlledUpdateGraph;
import io.deephaven.enterprise.database.Database;
import io.deephaven.parquet.table.ParquetTools;
import io.deephaven.util.SafeCloseable;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class UnitTest {
    private static final String TEST_DATE = "2017-08-25";

    private static Database db;
    private static SafeCloseable executionContextCloseable;

    @BeforeAll
    public static void setup() {
        /*
         * Initialize the execution context for unit tests.
         * Table operations must be run within an open execution context.
         */
        final ExecutionContext executionContext = TestExecutionContext.createForUnitTests();
        final ControlledUpdateGraph updateGraph = (ControlledUpdateGraph) executionContext.getUpdateGraph();
        updateGraph.enableUnitTestMode();
        updateGraph.resetForUnitTests(false);
        executionContextCloseable = executionContext.open();
        db = getTestDatabase();
    }

    @AfterAll
    public static void cleanUp() {
        /*
         * Clean up the execution resources.
         */
        executionContextCloseable.close();
    }

    /**
     * Mocking the database allows us to test our queries' correctness
     * without the complication of creating a real database instance.
     */
    private static Database getTestDatabase() {
        final Database db = mock(Database.class);
        addData(db);
        return db;
    }

    /**
     * The queries we are testing use historical table data. This method sets up the mock database to return
     * the test CSV and Parquet data when db.historicalTable is called.
     */
    private static void addData(final Database db) {
        try {
            final Table t = CsvTools.readCsv("src/test/resources/StockTrades.csv");
            when(db.historicalTable("LearnDeephaven", "StockTrades")).thenReturn(t);
        } catch (CsvReaderException e) {
            throw new RuntimeException("Could not read StockTrades.csv test data", e);
        }

        final Table t = ParquetTools.readTable("src/test/resources/StockQuotes.parquet");
        when(db.historicalTable("LearnDeephaven", "StockQuotes")).thenReturn(t);
    }

    @Test
    public void testAvgPriceQuery() {
        final Table testResult = QueryUtils.getAvgStockPriceTable(db, TEST_DATE);
        Assertions.assertFalse(testResult.isEmpty());

        final double sumAAPL = getFirstDoubleFromColumn(testResult.where("USym = `AAPL`"), "Last");
        Assertions.assertEquals(159.472, sumAAPL, 0.0002);

        final double sumGOOG = getFirstDoubleFromColumn(testResult.where("USym = `GOOG`"), "Last");
        Assertions.assertEquals(923.56, sumGOOG, 0.0002);
    }

    @Test
    public void testMidPriceQuery() {
        final Table testResult = QueryUtils.getMidStockPriceTable(db, TEST_DATE);
        Assertions.assertFalse(testResult.isEmpty());

        final double midPFE = getFirstDoubleFromColumn(testResult.where("USym = `PFE`"), "Mid");
        Assertions.assertEquals(33.285, midPFE, 0.0);
    }

    private static double getFirstDoubleFromColumn(final Table table, final String column) {
        // To extract data from a specific cell in a Table,
        // we first need the row key for that row position.
        // Row 0 does not necessarily have row key 0.
        final long rowKey = table.getRowSet().get(0);
        // Use the ColumnSource to get the value of the column for that row key.
        return table.getColumnSource(column).getDouble(rowKey);
    }
}
Connect to a remote DB
Client applications can connect to Deephaven server installations to run queries on a remote database. See the Core+ Java Client for more information.