Extract. Transform. Load.

Quix is a fully managed stream processing platform that helps teams transform data into products.


Stream processing at its easiest.

With Quix, data teams develop, test, and deploy Python code that processes data on a message broker, without any hassle.

Data sources

Normalize JSON data into structured dataframes

import json
import pandas as pd

def on_event_handler(json_data: str):

    # The latest data read is in the raw JSON string json_data
    data = json.loads(json_data)
    df = pd.json_normalize(data['results'])  # Normalize into a dataframe
    df = df.drop('var5', axis=1)  # We don't need the column var5
    df = df.dropna()              # Drop rows with nulls

    # Publish/write df data into our import stream
    json_import_stream.parameters.buffer.write(df)





Engineer new features not available in source data

from pandas import DataFrame

def on_data_handler(df: DataFrame):

    # The latest data read from json_import_stream is in the dataframe df
    # Convert the temperature column from Celsius to Kelvin
    df['temp'] = 273.15 + df['temp']

    # Standardize var1 using external data: the historical
    # average (avg) and standard deviation (std), computed ahead of time
    df['var1'] = (df['var1'] - avg) / std

    # Publish/write df data into the output stream
    prepared_stream.parameters.buffer.write(df)

json_import_stream.parameters.on_read_pandas += on_data_handler

Perform ML model predictions

import pickle

# Load the trained ML artifact into memory as ml_model
with open("ml-model.pkl", "rb") as f:
    ml_model = pickle.load(f)

def on_data_handler(df: DataFrame):

    # List of variables the model was trained with
    var_list = ['temp', 'var1', 'var2', 'var3', 'var4']

    # Use the model to generate predictions and save them in new columns
    df['pred_score'] = ml_model.predict(df[var_list])
    df['pred_class'] = df['pred_score'].astype(int)  # 0s & 1s

    # Publish/write df data into predicted_stream
    predicted_stream.parameters.buffer.write(df)

prepared_stream.parameters.on_read_pandas += on_data_handler

Act on the ML model results

def on_data_handler(df: DataFrame):

    # Let's say 1s are anomalies. Are there any in this batch?
    if df['pred_class'].sum() > 0:

        # Send the anomalous ids to a channel in Slack, for example
        slack.send("Alert", df.loc[df['pred_class'] == 1, 'id'])

predicted_stream.parameters.on_read_pandas += on_data_handler






Pipeline: Data sources → Clean → Prepare → Predict → Act → Destinations
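Put together, the handlers above form exactly that pipeline. Here is a minimal local sketch of the same flow in plain Python and pandas, with a hypothetical list of raw JSON events standing in for the broker and print() standing in for the Slack alert, purely to show how the stages compose:

import json
import pandas as pd

def run_pipeline(raw_events, ml_model, avg, std):
    for json_data in raw_events:
        # Clean: normalize the raw JSON into a flat dataframe
        df = pd.json_normalize(json.loads(json_data)['results'])
        df = df.drop('var5', axis=1).dropna()

        # Prepare: engineer features (Celsius to Kelvin, standardize var1)
        df['temp'] = 273.15 + df['temp']
        df['var1'] = (df['var1'] - avg) / std

        # Predict: score each row with the pre-trained model
        var_list = ['temp', 'var1', 'var2', 'var3', 'var4']
        df['pred_class'] = ml_model.predict(df[var_list]).astype(int)

        # Act: flag any anomalies (1s) for alerting
        anomalies = df.loc[df['pred_class'] == 1, 'id']
        if not anomalies.empty:
            print("Alert:", anomalies.tolist())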

“We had to hide a team for two years to build this — then it became mission critical.”

VP Software Engineering, Mobility industry

What can you do with Quix?

Easily build real-time data applications with our fully managed stream processing platform.

Financial services

Detect financial fraud in real time. Prevent fraud and abuse. Accelerate payment approvals.

Finance use cases ->

Mobility solutions

Optimize fleet or product movement and delivery.

Mobility use cases ->

Gaming industry

Gauge audience sentiment and prevent chat abuse.

Gaming use cases ->

Telecom resiliency

Monitor systems to detect anomalies. Automate corrective action.

Telecom use cases ->

Integrate with your stack quickly.

From source to destination, Quix connectors handle your data in easy-to-manage streams.
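As a rough illustration of what a source connector does, the sketch below uses the kafka-python package (an assumption for illustration, not the Quix connector API) to consume raw JSON messages from a broker topic and hand each one to the on_event_handler shown earlier:

from kafka import KafkaConsumer  # assumption: plain kafka-python client, not the Quix SDK

# Consume raw JSON messages from a source topic (names are placeholders)
consumer = KafkaConsumer(
    'source-topic',
    bootstrap_servers='localhost:9092',
    value_deserializer=lambda v: v.decode('utf-8'),
)

for message in consumer:
    # Hand each raw JSON payload to the cleaning handler defined above
    on_event_handler(message.value)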

Trust in a production-ready platform.

Fast

<10ms latency end-to-end — from client to cloud and back

Reliable

99.94% uptime last 90 days, SLAs available

Resilient

100% data delivery guarantee with built-in healing and replication

Secure

AES-256 SSL/TLS encryption throughout the in-memory processing pipeline

We merge Kafka and Kubernetes in perfect matrimony.

And your apps lived happily ever after 🎉

Interested in learning more?

Chat with our friendly experts on Discord, or book time to talk about your project.