Prisma Photonics, a fast-growing startup, transforms infrastructure monitoring with optical fibers. We eliminate the need for extra sensors by offering sensor-free solutions to monitor electrical power grids and oil & gas pipelines across thousands of kilometers.
Our fiber-sensing technology, integrated with AI and machine learning, enables our customers to achieve environmental and renewable energy targets, ensuring smooth utility operations on their path to net-zero emissions.
We are looking for the best minds and spirits to join us on our journey. We know our product is only as great as the people building the hardware and software and harnessing data for good causes. Being a great team member means being eager to learn and grow, able to challenge and to be challenged, and working for the team and the product with enthusiasm and passion.
We are now hiring a talented, self-driven and passionate Data Engineer to build and maintain optimized and highly available data pipelines that facilitate deeper analysis and reporting.
Responsibilities:
- Design data collection, storage, search and access APIs.
- Develop the data storage and metadata databases that our AI and Algorithms teams use to access training and test data.
- Design and implement data pipelines and ETLs for both on-prem and cloud environments.
- Make data quickly and easily accessible and searchable.
- Work closely with Software, AI and Algorithm teams on system integration, benchmarking and data access.
Requirements:
- Bachelor’s degree in a quantitative field such as math, computer science, or engineering.
- Know-how and experience in building and organizing databases.
- 2-3 years of experience with SQL and NoSQL databases (MySQL, Elasticsearch, MongoDB, PostgreSQL).
- Strong programming skills, primarily in Python.
- Experience with design and implementation of data pipelines.
- Strong collaborator with teams and peers.
- Innovative with a growth mindset.
Advantages:
- Experience with streaming technologies, unstructured data, and data types other than text and tables.
- Experience with additional programming languages.
- Experience with any of the following: Airflow, Jenkins, Git, Kubernetes, distributed computing.
- Cloud provisioning and administration experience.
- Experience working in a cloud environment (preferably AWS).
- Knowledge of data analysis tools such as Grafana or Datadog.