Deephaven Community Core release 0.29 brings new features, quality-of-life improvements, and better API documentation, along with a number of bug fixes and performance improvements. For the full release notes, see the GitHub release page.
## New features
### Function-generated tables
Real-time table creation should be easy. Deephaven Community Core v0.29 introduces function-generated tables, which let users create ticking tables from pure Python functions. The function is re-evaluated periodically, and the refreshed data flows seamlessly into your table. Re-evaluation can be driven by a fixed time interval or by one or more ticking tables that act as triggers.
For an example use case, see Create ticking tables with pure Python functions, where Alex uses function-generated tables to fetch weather data from tomorrow.io.
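As a quick sketch (assuming the factory lives at `deephaven.table_factory.function_generated_table` and accepts a `refresh_interval_ms` argument; the generator function itself is hypothetical), a table that refreshes every two seconds might look like this:

```python
from deephaven.table_factory import function_generated_table
from deephaven import empty_table
import random

# Hypothetical generator: build a small table of readings on each evaluation.
def make_readings():
    reading = random.random() * 100
    return empty_table(5).update(["Row = i", f"Reading = {reading}"])

# Re-evaluate make_readings roughly every 2 seconds; the result ticks like any other table.
readings = function_generated_table(make_readings, refresh_interval_ms=2000)
```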
### Kafka protobuf support
Deephaven now supports Kafka protobuf via the classpath: callers can use the descriptor from a compiled protobuf class to parse Kafka payloads. This complements the existing Kafka protobuf schema registry support.
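As a rough, hypothetical sketch only (the broker address, topic, and protobuf class are made up, and the exact name of the protobuf value-spec argument may differ from what is shown here), consuming such a topic could look something like this:

```python
from deephaven.stream.kafka import consumer as kc

# Hypothetical broker, topic, and protobuf class; the protobuf_spec argument
# shown here is an assumption, not a confirmed signature.
orders = kc.consume(
    {"bootstrap.servers": "redpanda:9092"},
    topic="orders",
    key_spec=kc.KeyValueSpec.IGNORE,
    value_spec=kc.protobuf_spec(message_class="com.example.Order"),
    table_type=kc.TableType.blink(),
)
```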
### Liveness scope API
Deephaven's Liveness Scope API gives users a finer degree of control over the query update graph. More specifically, it allows users to clean up unreferenced nodes in the graph without having to rely solely on garbage collection. The API allows:
- The `liveness_scope` function to be used in a `with` expression or as a decorator:
```python
from deephaven.liveness_scope import liveness_scope

def get_table():
    with liveness_scope() as scope:
        ticking_table = some_ticking_source()
        table = ticking_table.snapshot().join(table=other_ticking_table, on=...)
        scope.preserve(table)  # keep this table alive after the scope closes
    return table
```
```python
from deephaven import numpy as dhnp

@liveness_scope
def get_values():
    ticking_table = some_ticking_source().last_by("Sym")
    return dhnp.to_numpy(ticking_table)
```
- The `LivenessScope` class to be used directly for greater control:
```python
from deephaven.liveness_scope import LivenessScope

def make_table_and_scope(a: int):
    scope = LivenessScope()
    with scope.open():
        ticking_table = some_ticking_source().where(f"A={a}")
        return ticking_table, scope

t1, s1 = make_table_and_scope(1)
# ... wait for a while
s1.release()
t2, s2 = make_table_and_scope(2)
# etc.
```
### R client update_by
`update_by` is now available in the R client. This powerful operation can compute cumulative or rolling metrics such as sums, standard deviations, EMAs, and more. Yet another tool to add to your "R"senal.
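While the new bindings are for the R client, the same operation is already available in Python. Here is a minimal Python sketch of the kind of per-group cumulative and rolling metrics `update_by` computes (the table and column names are made up):

```python
from deephaven import empty_table
from deephaven.updateby import cum_sum, rolling_avg_tick

# A made-up source table with a grouping column and a numeric column.
source = empty_table(20).update(["Sym = i % 2 == 0 ? `A` : `B`", "X = i"])

# Cumulative sum of X and a 5-row rolling average of X, computed per symbol.
result = source.update_by(
    ops=[cum_sum(cols=["SumX = X"]), rolling_avg_tick(cols=["AvgX = X"], rev_ticks=5)],
    by=["Sym"],
)
```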
## Quality-of-life improvements
### Python type hints
Deephaven can now take even more advantage of Python functions that use type hints. Not only are scalars supported...
```python
from deephaven.dtypes import Instant
from deephaven.time import dh_now
from deephaven import empty_table
import numpy as np

def int_func(val) -> np.intc:
    return val + 3

def double_func(val) -> np.double:
    return val * np.pi

t = empty_table(1).update(["X = i", "IntCol = int_func(X)", "DoubleCol = double_func(X)"])
t_meta = t.meta_table
```
...but lists and NumPy arrays are supported, too:
```python
from deephaven import empty_table
from numpy import typing as npt
from typing import List
import numpy as np

def list_func(val) -> List[np.int32]:
    return list(range(val, val + 5))

def ndarray_func(val) -> npt.NDArray[np.int32]:
    return np.arange(val, val + 5, dtype=np.int32)

t = empty_table(10).update(["X = i", "Y = list_func(X)", "Z = ndarray_func(X)"])
t_meta = t.meta_table
```
### Parquet
Our Parquet integration has also been upgraded significantly in the last few months. With v0.29, it now:

- Supports `LZ4_RAW` compression (see the sketch below).
- Is faster, especially when writing data to Parquet.
- Has better error handling.
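For example, here is a minimal sketch of writing with the new codec (the table and output path are made up, and this assumes the `compression_codec_name` argument of `deephaven.parquet.write` accepts `"LZ4_RAW"`):

```python
from deephaven import empty_table
from deephaven.parquet import write

# A small example table; any Deephaven table works here.
source = empty_table(100).update(["X = i", "Y = X * X"])

# Write it to Parquet with the newly supported LZ4_RAW codec (path is hypothetical).
write(source, "/data/source.parquet", compression_codec_name="LZ4_RAW")
```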
### API documentation
Deephaven's R client is the premier way to work with ticking data in R. Any good API deserves good documentation, and with Deephaven Community Core v0.29, it's here. To access the documentation, use R's `?` command. For instance, to see the documentation on table handles:

```r
?TableHandle
```
Deephaven's Python API documentation has also been improved. You'll notice that the formatting and nomenclature have been fixed or updated on many pages.
## Let's connect
We hope Deephaven's Community docs provide guidance and answer all your questions. We're happy to help in your querying adventures; reach out to us on Slack. Our community continues to grow!