Careers

 


Join Us

Develop your technology career at the intersection of growth and innovation

  • IoT Architect
  • Data Engineer
  • Senior IoT Systems Engineer

IoT Architect

Location: Boston, MA
Duration: Full-time

Job Description:

 
  • Assess the AWS platform running IoT-based workloads that supports 3.3 million users, and identify technical debt in the current state.
  • Understand the current state of the platform and its existing automation.
  • Understand and document the application architecture, technology landscape, current deployment cycle, and activities performed.
  • Understand and document the current tooling set-up and the integration of the various pipelines for in-scope applications, and publish a report on the findings.
  • Review the code of in-scope applications for quality, security, scalability, and stability.
  • Evaluate continuous delivery, monitoring, configuration management, and security within DevOps pipelines, and create a backlog to implement continuous delivery pipelines using DevOps principles.
  • Review the existing build and release process, and the continuous testing set-up for both functional and non-functional testing.
  • Review the test data and environment set-up process.
  • Review the current tools and infrastructure set-up for DevOps and recommend improvements.
  • Identify current Service Level Indicators (SLIs) and Service Level Objectives (SLOs), and measure current availability and reliability.
  • Identify the metrics to monitor to ensure appropriate reliability.
  • Define how to calculate the SLI for each metric.
  • Set a target as the SLO to compare against during operation.
  • Define a steady state as a measurable output of the platform's normal behavior.
  • Build a hypothesis around steady-state behavior to support more than 2x growth, to a user base of around 7.5 million, over the next two years.
  • Identify the platform's current availability (number of nines) and draw a roadmap toward the 99.99% target.
  • Perform a risk analysis of the current platform to understand its breaking points.
  • Introduce variables that reflect real-world events to identify risk points and breaking points under increased load, and define the point of failure.
  • Build resilience to scale.
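The SLI/SLO steps above reduce to a small calculation: measure the SLI, compare it against the 99.99% SLO named in the posting, and track the remaining error budget. A minimal sketch, with purely illustrative request counts:

```python
# Minimal sketch of an availability SLI / SLO / error-budget calculation.
# The request counts below are illustrative, not real platform data.

def availability_sli(good_requests: int, total_requests: int) -> float:
    """SLI: fraction of requests served successfully in the window."""
    return good_requests / total_requests

def error_budget_remaining(sli: float, slo: float) -> float:
    """Fraction of the error budget still unspent (1.0 = untouched, < 0 = SLO breached)."""
    allowed_failure = 1.0 - slo    # e.g. 0.0001 for a 99.99% SLO
    actual_failure = 1.0 - sli
    return 1.0 - actual_failure / allowed_failure

slo = 0.9999                                    # the 99.99% target from the posting
sli = availability_sli(9_999_100, 10_000_000)   # hypothetical traffic window

print(f"SLI: {sli:.6f}")
print(f"Error budget remaining: {error_budget_remaining(sli, slo):.1%}")
```

The same shape works for any metric the assessment identifies, as long as "good" and "total" events are countable.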

Data Engineer

About the Role:

We are seeking a Senior Data Engineer to join our team and enhance our data infrastructure and workflows. You will be responsible for designing and implementing scalable data models and pipelines, ensuring data quality, and improving ETL processes.

 

Responsibilities:

 
  • Collaborate with business partners and stakeholders to understand data requirements.
  • Work with engineering, product teams, and customers to collect and integrate data.
  • Design, develop, and implement high-performance data models and pipelines for Data Lake and Data Warehouse.
  • Develop and enforce data quality checks, conduct QA, and set up monitoring routines.
  • Enhance the reliability and scalability of ETL processes.
  • Manage a portfolio of data products that ensure high-quality and trustworthy data.
  • Onboard and support new engineers joining the team.
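The data quality checks mentioned above might look like the following minimal sketch; the column names, batch contents, and thresholds are hypothetical:

```python
# Minimal sketch of row-level data quality checks over an extracted batch.
# Column names ("user_id", "event_ts") and the sample batch are hypothetical.

def check_not_null(rows: list[dict], column: str, max_null_rate: float = 0.0) -> bool:
    """Pass if the fraction of NULLs in `column` is within the allowed rate."""
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls / len(rows) <= max_null_rate

def check_unique(rows: list[dict], column: str) -> bool:
    """Pass if `column` contains no duplicate values."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

batch = [
    {"user_id": 1, "event_ts": "2024-01-01T00:00:00Z"},
    {"user_id": 2, "event_ts": "2024-01-01T00:05:00Z"},
    {"user_id": 3, "event_ts": None},
]

results = {
    "user_id_unique": check_unique(batch, "user_id"),
    "event_ts_not_null": check_not_null(batch, "event_ts"),
}
print(results)  # a failing check would block the load or raise an alert
```

In practice such checks run as a pipeline step after extraction and before loading, so bad batches never reach the warehouse.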

Qualifications:

  • 5+ years of professional experience in data engineering, business intelligence, or a related role.
  • 3+ years of experience with ETL orchestration and workflow management tools (e.g., Databricks, Airflow, Flink, Oozie, Azkaban) on AWS or GCP.
  • Proficiency in Python, dbt, SQL, and distributed computing, with strong database fundamentals.
  • 3+ years of experience with distributed data ecosystems (e.g., Spark, Hive, Druid, Presto) and streaming technologies (e.g., Kafka, Flink).
  • Experience with Snowflake, Redshift, PostgreSQL, or other DBMS platforms.
  • Familiarity with reporting tools such as Tableau, Superset, or Looker.
  • Excellent communication skills; experience working with technical and non-technical teams.
  • Self-starter with strong organizational skills, comfortable in a fast-paced environment.
  • Strategic thinker with the ability to analyze and interpret market and consumer data.

Benefits:

  • Competitive salary
  • Comprehensive health, dental, and vision insurance
  • Opportunities for growth and development
  • A collaborative and innovative work culture
 

ScalarEdge is looking for a Senior Data Engineer who is ready to make a significant impact by building scalable data solutions that drive our mission forward. If you're passionate about data and eager to work in a dynamic, fast-growing startup environment, we encourage you to apply.

 

Senior IoT Systems Engineer

Location: Houston, TX

 

The Team:

You'll be part of an innovative engineering team that specializes in the development and deployment of advanced Industrial IoT systems. The team collaborates closely with cross-functional departments, including networking and integration partners, to deliver reliable, high-performance solutions. Your contributions will be essential to ensuring seamless data flow, system functionality, and integration across diverse platforms. The team is currently focused on key initiatives including Data Broker configuration, Timeseries Historian management, and Operations Hub development, all managed and deployed via cloud platforms like AWS and Edge platforms using Kubernetes.

 

The Role:

As a Senior IoT Systems Engineer, you will design and implement backend solutions using Rust for MQTT brokering, Historian, and Data Operations. You will develop and maintain rules, schemas, and APIs to aggregate data, calculate KPIs, link data sources, and publish data as MQTT payloads. Your work will also involve creating robust and scalable APIs to facilitate seamless data connectivity and management across the enterprise. Collaborating with cross-functional teams, including front-end developers and business analysts, will be key to ensuring successful project delivery.

You will also be responsible for data management tasks such as creating efficient data aggregation processes and developing calculations and methodologies to accurately report key performance indicators (KPIs). Your role will be pivotal in ensuring the smooth deployment and management of IoT systems in complex environments.
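As one illustration of the KPI and payload work described above, the aggregation logic might look like the sketch below (in Python rather than Rust, purely for brevity). The machine name, metric names, readings, and topic are hypothetical, and the actual broker publish call is left as a comment:

```python
import json
import time

# Hypothetical edge reading; in the real system these values would arrive
# as MQTT payloads from field devices and be aggregated by broker rules.
reading = {"machine": "press-01", "runtime_s": 3400, "planned_s": 3600,
           "good": 940, "total": 1000}

def build_kpi_payload(r: dict) -> str:
    """Aggregate one reading into KPI metrics and serialize it as a JSON payload."""
    availability = r["runtime_s"] / r["planned_s"]   # fraction of planned time running
    quality = r["good"] / r["total"]                 # fraction of good parts produced
    payload = {
        "timestamp": int(time.time() * 1000),
        "metrics": [
            {"name": f"{r['machine']}/kpi/availability", "value": round(availability, 4)},
            {"name": f"{r['machine']}/kpi/quality", "value": round(quality, 4)},
        ],
    }
    return json.dumps(payload)

payload = build_kpi_payload(reading)
print(payload)
# An MQTT client would then publish it, e.g.:
# client.publish("site/area1/press-01/kpi", payload)   # hypothetical topic
```

The timestamp-plus-metrics shape mirrors the kind of structured payloads the role would publish through the Data Broker.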

 

What You'll Bring:

Education: Bachelor's degree in Computer Science, Engineering, or a related field. Equivalent work experience will also be considered.

Experience: 3-5 years of experience in backend development, with a focus on the Rust or C programming languages.

Technical Skills:

  • Proficiency in Rust and experience with system-level programming.
  • Basic understanding of the MQTT protocol or brokering solutions.
  • Experience with historian systems or general data storage solutions.
  • Knowledge of API development and integration.
  • Familiarity with data aggregation and schema creation.
  • Experience with containerization and orchestration tools like Docker and Kubernetes is a plus.

 

Soft Skills:

  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.
  • Ability to work in a fast-paced, dynamic environment.
  • Self-motivated with a passion for continuous learning and development.

Added Bonus if You Have:

  • Experience with Industry 4.0 technologies, including publish/subscribe brokers and the MQTT protocol.
  • Familiarity with high-speed time-series historians like MongoDB, TimescaleDB, or InfluxDB.
  • Proficiency in using Azure DevOps for project planning.
  • Familiarity with AWS cloud platform tools like Greengrass and IoT Core.
  • Understanding of basic message standards like JSON and Industry 4.0 standards like Sparkplug B.
