At Zipstack, we’re redefining how organizations of all sizes access and work with data. By combining Data Virtualization and Data-as-Products, Zipstack lets organizations uniformly access and put to use data locked in hard-to-reach silos. We empower domain experts with low-code / no-code tools, enabling them to become data product owners. Be it data from logs, relational DBs, NoSQL DBs, lakes, warehouses, 3rd party data providers or SaaS applications, Zipstack can unify data from disparate sources, letting businesses run at the speed of data.
The role described here is that of an individual contributor: a senior software development engineer. Your time will mostly be spent understanding problems from architects and writing clean, maintainable and testable code. Working at an early-stage startup means that your input on any problem you feel is worth tackling is more than welcome.
Zipstack Mesh is an implementation of the best ideas from Data Mesh and Data Fabric. Through the Mesh platform, we help customers access data no matter where it resides, live and in place. There are significant engineering challenges to be solved to make this frictionless, scalable and highly performant. There’s always the thrill of working at an early-stage startup, where ideas and execution are equally important.
While providing easy access to data is important, equally important is the ability to balance it with robust governance, access control, security and auditing. We follow a DevOps culture, meaning you own everything end-to-end: from development to running Mesh in production for real customers with massive amounts of data.
Mastery of whichever programming language you’ve used most in your career
Experience working on highly scalable web services
Experience handling and processing large amounts of data, at least approaching TB scale
Comfort working in a DevOps culture where you own production environments. You should be comfortable with DevOps-related technologies like Docker, Kubernetes, etc., or otherwise open to picking up modern DevOps tooling
Ability to write testable code, with deliverables including a significant portion covered by unit tests. The practice of TDD is highly desirable
Sufficient understanding of the underlying operating system to aid in debugging various issues, especially performance-related ones
Comfort in building and maintaining CI/CD pipelines
Solid understanding of basic principles of security: authentication, authorization, encryption, PKI, etc
Good understanding of RDBMSs and working knowledge of some NoSQL databases
Excellent communication and documentation skills
Highly collaborative, both within and across teams
Experience with Java is highly desirable, though not essential. You must be open to switching to Java
Experience with Python is highly desirable
Comfort working with a microservices-based architecture
Experience with Spark, Presto, Flink, Hadoop
Experience with Apache Airflow
Experience building ETL pipelines
Experience with NoSQL and relational databases at scale