Bronze Drum Lattice: QData and QCompute Solution for Quantitative Finance

Bronze Drum Consulting Presents

QData and QCompute

Solutions for Quantitative Finance on AWS


The Company

Bronze Drum Consulting, Inc. (incorporated in 1997) and Carbon09, Inc. (incorporated in 2016) are sibling firms: Carbon09, Inc. is the software and web services entity, and Bronze Drum is the consulting, strategy, and professional services firm.

Founder and CEO: Brian McCallion. Brian has worked on cloud strategy and solutions since the early days of AWS, working with the AWS product team and Fortune 500 firms building production applications on AWS, and he is responsible for several key AWS features. Today Brian manages a ten-person team of carefully selected professionals, each of whom brings unique capabilities to our customers.

Core Team
Claudio Torres brings over twenty years of experience as a hands-on financial engineer, with expertise in derivatives curves and algorithmic trading at firms such as Nomura, BNY Mellon, Credit Suisse, Bank of America, and JPMorgan Chase.

David Secrest, our de facto genius in residence, studied particle physics in Princeton University’s elite doctoral program and was the only one in his class of 20 to focus on basic research.

Origin of Our Quant Data and Quant Compute Solution

In 2014, while consulting with a Swiss data provider and a key financial institution, we recognized an opportunity to address challenges faced by many firms. As a result we created a service that provides firms with a private marketplace and lab where quant developers:

1. Subscribe to data and ingest it formatted exactly as required

2. Run their compute at scale, with a lightweight visual “step language” that describes how functions, data inputs, and function outputs are created

3. Maintain a clear lineage for each data attribute

4. Meter the data and the requests made on a per-developer-key basis, including usage plans

5. Negotiate custom pricing per end user

6. Use a managed distribution of the open source risk library QuantLib

Featured data providers include Bloomberg Data License, Intex, and Interactive Data, with more data options to come.

Along the way we also found we solved a number of additional challenges faced by large and small organizations.

QData™ and QCompute™ simplify getting the data you need and attributing and metering that data across specific application components, developers, or firms. QData™ also simplifies the interface developers call, presenting the data required for a specific function via a documented API tailored to your specific requirements.

1. QCompute™ provides the Continuous Delivery and batch cluster environment to package your code, then execute that code across a cluster of thousands of nodes. While popular scripting languages and risk libraries are often single threaded, QCompute™ runs your computations in containers, and in this way takes advantage of multi-core and GPU hardware.

2. Dependencies. QCompute™ lets you define dependencies between jobs, and jobs are executed in the order those dependencies dictate. The outputs of one job are seamlessly provided as inputs to downstream jobs based on the dependencies defined for those jobs.
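The dependency model above amounts to running jobs in topological order of a dependency graph. A minimal sketch, assuming hypothetical job names (this is not the QCompute™ implementation, just the underlying idea):

```python
from graphlib import TopologicalSorter

# Hypothetical job graph: each job maps to the set of jobs it depends on.
# Upstream outputs feed downstream jobs, so upstream jobs must run first.
jobs = {
    "discount_curve": {"ingest_rates"},
    "price_swaps": {"discount_curve"},
    "risk_report": {"price_swaps", "ingest_trades"},
}

# static_order() yields an execution order that respects every dependency.
order = list(TopologicalSorter(jobs).static_order())
```

A scheduler walking `order` never starts a job before the jobs it depends on have completed.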




Key Feature for Data Management

Data Metering and Usage Plans


As soon as your users (subscribers) start to make calls to the APIs using their API keys, their usage is throttled and limited as specified in the plan. You can view their usage at any time by clicking Usage.

Quotas are applied and respected in real time; usage data can lag by up to 30 minutes.

You can download usage data for the plan by clicking Export Usage Data.
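The combination of a per-second rate limit and a period quota can be illustrated with a toy in-process meter. This is a hedged sketch only — the class and parameter names are hypothetical, and real enforcement happens in the managed API layer, not in your code:

```python
import time

class UsagePlan:
    """Toy per-key usage plan: a per-second rate limit plus a total quota.
    Illustrative only; actual metering is enforced by the API gateway."""
    def __init__(self, rate_limit_per_sec, quota):
        self.rate_limit = rate_limit_per_sec
        self.quota = quota                 # total requests allowed in the period
        self.used = 0                      # requests consumed against the quota
        self.window_start = time.monotonic()
        self.window_count = 0              # requests in the current one-second window

    def allow(self):
        now = time.monotonic()
        if now - self.window_start >= 1.0:           # roll to a new one-second window
            self.window_start, self.window_count = now, 0
        if self.used >= self.quota:                  # quota exhausted: reject
            return False
        if self.window_count >= self.rate_limit:     # throttled within this second
            return False
        self.window_count += 1
        self.used += 1
        return True

plan = UsagePlan(rate_limit_per_sec=2, quota=3)
results = [plan.allow() for _ in range(4)]   # four calls within one second
```

The first two calls succeed; the third and fourth are throttled because the two-per-second rate limit is already spent, even though quota remains.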





Compute and Data

Carbon09 Lattice offers the data, the attribution, and the ability to run risk pricing on anything from a single-node cluster to clusters of more than a thousand nodes.



The Data Challenge QData™ Addresses

Quant developers spend large amounts of time on what amounts to a “scavenger hunt” for the data required to calculate fundamental building blocks such as term structures, discount curves, and OAS curves. In many firms, different groups within the organization are responsible for producing specific numbers, such as interest rates, while other groups perform calculations using these rates as inputs. Smaller firms may source this information as reference data from one or more of the world’s capital market data providers.

The Challenge QCompute™ Addresses

Once found, data must be wrangled and incorporated into the specific datasets required for specific calculations. Because each “number” must be traceable back to its source so that the output of a calculation can be reconciled with industry models, attribution (genealogy) must be maintained. Further, because steps in a calculation can involve tens of thousands of iterations of compute-intensive calculations, when a number changes the downstream dependencies must be known.
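The attribution requirement above — every number traceable back to its source — can be sketched as values that carry their lineage with them. A minimal illustration; the source labels (e.g. "BloombergDataLicense:USD_LIBOR_3M") are made-up strings, not real feed identifiers:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Traced:
    """A numeric value paired with the lineage of sources that produced it."""
    value: float
    sources: tuple = ()

def traced_op(name, fn, *inputs):
    # Record the operation name plus every upstream source, so any output
    # can be reconciled back to the raw data attributes that fed it.
    lineage = (name,) + tuple(s for i in inputs for s in i.sources)
    return Traced(fn(*(i.value for i in inputs)), lineage)

# Hypothetical inputs, each tagged with an illustrative source label.
libor = Traced(0.021, ("BloombergDataLicense:USD_LIBOR_3M",))
spread = Traced(0.003, ("Intex:deal_spread",))
rate = traced_op("add_spread", lambda a, b: a + b, libor, spread)
```

Any downstream calculation built from `rate` inherits the full chain of sources, which is what makes reconciliation and impact analysis possible when an input changes.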

Individual Quants can easily manage a cluster of 1000 nodes or more, without the need for an outside team to support them!

Both the data inputs and the order and dependencies of the functions must be well understood. In this sense, financial engineering would seem to warrant a scientific manufacturing process.

At the same time, models change continuously and trillions of dollars are hedged and leveraged based on the output of these calculations.

High Impact

Together, QData™ and QCompute™ make your quant team more productive, improve accuracy, and provide detailed logging and auditability for each calculation and each data request. By managing data with APIs, not only does your organization better understand the data it consumes, but new team members also get up to speed quickly, because the data can be understood in relation to the calculations they need to perform. By incorporating a Continuous Integration process that packages code to run on a scale-out cluster, your team spends more time on the things that matter to your business, and less on the undifferentiated heavy lifting of compiling code, packaging code, and managing compute clusters.

Professional Services

A simple deployment can be completed in six weeks or less, and your solution will be up and running within days.

Rapid Delivery

The majority of implementation time is spent working with you and your team to optimize the benefits and tailor the solution to your specific requirements.

We work with your firm to define the inputs required, ensure you have the necessary data, and optimize how data requests are returned to your developers. Behind the API we can implement functions that fetch exactly the data required to price a zero coupon bond, create a term structure, price a vanilla swap, or price an exotic derivative such as an FX TARN. Further, we work with you to define a Continuous Delivery and testing process so that, as your quants check in code, it is compiled, tested, and staged for deployment when approved. We work with you to set up the dependencies between jobs so that developers need only check in code, and the mapped dependencies recalculate pricing. Finally, we ensure you and your team understand how to use the solution to accelerate computation, run cost-efficient computing, and simplify audit and verification.
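As a concrete instance of “exactly the data required to price a zero coupon bond”: given a face value, a zero rate, and a maturity, the price is a single discounting step. A textbook sketch in plain Python — production pricing would come from the managed QuantLib distribution, not this function:

```python
import math

def zero_coupon_price(face, zero_rate, maturity_years, compounding="continuous"):
    """Price a zero coupon bond by discounting its face value.
    Textbook illustration only, not the QuantLib implementation."""
    if compounding == "continuous":
        # P = F * exp(-r * t)
        return face * math.exp(-zero_rate * maturity_years)
    # Annual compounding alternative: P = F / (1 + r)^t
    return face / (1 + zero_rate) ** maturity_years

# A 5-year zero at a 3% continuously compounded rate.
price = zero_coupon_price(100.0, 0.03, 5)
```

The point of QData™ is that the function's inputs (face, rate, maturity) arrive already matched to the calculation, rather than being hunted down across feeds.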

Managed Service – We can set up the solution so you can operate it as a managed service. In most cases we can supply data as a subscription, and in cases where you must sign directly with the exchange or data provider, we facilitate the process.

Proof of Concept – For larger organizations, we provide all the necessary components to evaluate the impact of different types of compute instances, including GPU and FPGA.

Blueprints – We deploy all solutions from a blueprint, known as a CloudFormation template. This enables your team to extend the solution, make changes, and create entirely new deployments within your organization. Because the template is a self-documenting artifact, customers appreciate being able to know every aspect of their compute and data infrastructure.
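To make the blueprint idea concrete, here is a trimmed, illustrative CloudFormation fragment. All resource names, AMI IDs, and instance types are placeholders, not the actual template we deploy:

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: Illustrative fragment of a compute-cluster blueprint (placeholder values)
Parameters:
  NodeCount:
    Type: Number
    Default: 4
Resources:
  ComputeFleet:
    Type: AWS::AutoScaling::AutoScalingGroup
    Properties:
      MinSize: "1"
      MaxSize: !Ref NodeCount
      LaunchConfigurationName: !Ref NodeLaunchConfig
      AvailabilityZones: !GetAZs ""
  NodeLaunchConfig:
    Type: AWS::AutoScaling::LaunchConfiguration
    Properties:
      ImageId: ami-00000000      # placeholder AMI
      InstanceType: c5.4xlarge   # swap for GPU/FPGA instance types in a proof of concept
```

Because the whole cluster is declared in one file, scaling out, changing instance types, or standing up a second environment is an edit-and-redeploy, not a manual rebuild.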

Call us Now at 646 308-1257 to get started!