The company enables smarter decision-making by accelerating the flow of data-driven insights. The company’s semantic layer platform simplifies, accelerates, and extends business intelligence and data science capabilities for enterprise customers across all industries. The company empowers customers to democratize data, implement self-service BI, and build a more agile analytics infrastructure for more impactful decision-making.
You will be a core contributor to the design and development of cutting-edge technologies used by the world’s largest organizations for data analytics. You will collaborate with product designers and other technical leads to tackle complex problems in analytics computation and data management. You will not only tackle the complexities of algorithm and data structure design but also integrate with modern technologies for data warehousing (e.g., Snowflake), data engineering (e.g., dbt), and analytics. Further, you’ll be responsible for considering the architectural implications of deployment and infrastructure concerns such as cloud and container technologies.
- Design, build, orchestrate, and automate infrastructure, applications, and monitoring tools;
- Help establish best practices, document designs, and mentor junior team members;
- Define requirements, estimate work, track dependencies, report progress, highlight blockers;
- BA/BS preferred in a technical or engineering field;
- 4+ years of experience;
- Experience with cloud-native development on providers such as AWS, Microsoft Azure, or GCP (e.g., AWS API Gateway, Lambda, SMS, ECS, EKS);
- Experience with infrastructure-as-code and technologies such as Terraform;
- Experience with Docker and orchestration technologies such as Kubernetes and Helm charts;
- Experience designing robust systems for high availability (HA), failover, and disaster recovery;
- Experience designing, configuring, and maintaining services for metrics, logging, and monitoring of the platform using tools such as the ELK stack, Prometheus, and Grafana;
- Experience with CI/CD and testing and familiarity with tools such as Jenkins and Git workflows;
- Familiarity with cloud security considerations such as NACLs, Security Groups, RBAC, etc.;
- Familiarity with different types of databases and cloud-native variants (e.g. Snowflake, BigQuery, RDS, Redshift, Neptune, DynamoDB, Cosmos DB, Athena);
Preference will be given to candidates with:
- Experience contributing to a production code base;
- Experience with DevOps best practices and tools such as GitHub and related automation;
- Experience designing and automating ETL/ELT workflows using cloud services such as AWS Data Pipeline, Glue, EMR, and Airflow, or Azure Data Factory;
36,000–52,800 USD / year
Join a team of passionate people committed to redefining the way business intelligence and AI are done.