Qualifications:
- 1–2 years of hands-on experience with Big Data technologies (e.g., Apache Spark) and cloud platforms (AWS, Azure, or GCP).
- Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or a related technical field.
- Strong object-oriented programming skills in Java and Python.
- Solid understanding of distributed systems, algorithms, and relational databases.
- Excellent verbal and written communication abilities.
- Strong troubleshooting skills for identifying the root causes of issues.
- Ability to thrive in a fast-paced, innovative environment.
- Capable of designing and coding solutions for broadly defined problems.
Roles & Responsibilities:
- Design and implement scalable, low-latency, high-availability, and performant applications/architectures.
- Drive best practices and promote engineering excellence.
- Collaborate with team members to design new system architectures and improve existing ones.
- Work in an agile environment to deliver high-quality software.
- Develop functional components, participate in code reviews, perform testing, handle deployments, and conduct post-launch monitoring.