Description
POSITION Software Engineer II / III
LOCATION Bangalore, India
ORGANIZATION OVERVIEW
HashedIn by Deloitte is a born-in-the-cloud, high-end software engineering and product development
company. We develop market-leading solutions by leveraging cloud-native technologies, AI
capabilities, and a pod-based delivery model.
Since our inception, we have successfully served more than 100 customers across industries and
continents, helping them launch new products faster, disrupt industries, and streamline and scale
their operations.
SKILLS & REQUIREMENTS
• Good knowledge of a web framework, preferably Flask or a similar Python framework.
• Experience with cloud-based CI/CD services, preferably AWS CodeBuild and CodePipeline.
• Experience with various other AWS services such as EC2, S3, Lambda, Step Functions, Glue, SNS, SQS,
Secrets Manager, etc.
• Strong grasp of SQL fundamentals: reading and writing SQL queries, familiarity with database
interaction tools (such as pgAdmin), columnar databases, and database optimization techniques
such as indexing. Knowledge of AWS Aurora is also good to have.
• Experience with Big Data frameworks like Hadoop and Spark is a plus.
• Good knowledge of API development and testing, including HTTP, RESTful services, Postman, and
related cloud services such as API Gateway (a minimal sketch follows this list).
• Strong coding, debugging, and problem-solving abilities, along with good knowledge of
Python. Experience with packaging tools such as pip and setuptools is good to have.
• Should have an eye for architecture: candidates should understand the trade-offs between
architectural choices, both at a theoretical level and at an applied level.
• Technical background in data, with a deep understanding of issues in multiple areas such as
data acquisition, ingestion and processing, data management, distributed processing, and
high availability.
• Quality delivery is the highest priority. Should know industry best practices and
standards for building and delivering performant and scalable APIs.
• Must have delivered a complex project, having completely or partially owned delivery of
projects that process user requests, query huge datasets, and serialize and return
responses in an optimized and efficient manner.
• Demonstrated expertise in team management as well as in working as an individual
contributor.
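For illustration only (not an additional requirement): a minimal sketch of the kind of RESTful Flask endpoint described above. The /items resource and its in-memory store are assumptions made up for this example, not part of the role.

# Minimal Flask REST sketch; the resource name and in-memory store are
# illustrative assumptions, standing in for a real database-backed service.
from flask import Flask, jsonify, request

app = Flask(__name__)
ITEMS = {}  # stand-in for a real database

@app.route("/items", methods=["POST"])
def create_item():
    # Parse the JSON body and assign a simple incrementing id.
    payload = request.get_json(force=True)
    item_id = len(ITEMS) + 1
    ITEMS[item_id] = {"id": item_id, **payload}
    return jsonify(ITEMS[item_id]), 201

@app.route("/items/<int:item_id>", methods=["GET"])
def get_item(item_id):
    # Return 404 for unknown ids, per common REST conventions.
    item = ITEMS.get(item_id)
    if item is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(item)

if __name__ == "__main__":
    app.run(debug=True)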
RESPONSIBILITIES
• Work at the intersection of infrastructure and software engineering, designing and
deploying data and pipeline management frameworks built on top of open-source
components, including Hadoop, Hive, Spark, HBase, Kafka, Tableau, and Airflow, as well as
cloud-based data engineering services such as S3, Redshift, Athena, Kinesis, etc.
• Collaborate with various teams to build and maintain innovative, reliable, secure,
and cost-effective distributed solutions.
• Design and develop the backend for efficient CRUD operations on data that can span more
than a million records and keeps growing (see the pagination sketch after this list).
• Deliver the most complex and valuable components of an application on time and to
specification.
• Play the role of a leader or individual contributor who influences a sizable portion of an
account or a small project in its entirety, demonstrating an understanding of practical
value and consistently combining it with theoretical knowledge to make balanced technical
decisions.
• Recognize inconsistencies in requirements, schedule accurately, and track progress,
providing visibility and proactively alerting the team and reporting authority.
• Excel at work breakdown and estimation: write clear and concise specifications for
outsourced work, create a work breakdown structure that uses existing services to deliver a
functional implementation, and support the development team with significant product
decisions; be seen as a major contributor to the architecture, feature set, etc., of product releases.
• Actively participate in customer communication, presentations, and the handling of critical issues.
• Lead assigned client and company resources in performing their roles on time and within
budget.
• Act as an individual contributor who is a role model for the application of the team's
software development and deployment processes, and contribute to best practices and
methodologies for the greater team.
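For illustration only (not an additional responsibility): a minimal sketch of keyset (seek) pagination, one common way to keep reads efficient as a table grows past a million records. The records table, its columns, and the in-memory SQLite database are assumptions made up for this example.

# Keyset pagination sketch: seek past the last-seen id instead of using
# OFFSET, whose cost grows with how deep into the table the page lies.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO records (payload) VALUES (?)",
    [(f"row-{i}",) for i in range(10_000)],
)

def fetch_page(last_seen_id, page_size=100):
    # The primary-key index turns this into a logarithmic seek plus a
    # short scan, regardless of how far into the table the page is.
    cur = conn.execute(
        "SELECT id, payload FROM records WHERE id > ? ORDER BY id LIMIT ?",
        (last_seen_id, page_size),
    )
    return cur.fetchall()

# Walk the table page by page.
page = fetch_page(last_seen_id=0)
while page:
    last_id = page[-1][0]
    page = fetch_page(last_seen_id=last_id)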
Technical Skills (Minimum):
• Proficiency in Python.
• Experience with a Python web framework like Django or Flask.
• Experience with RDBMS like PostgreSQL, MySQL, or Oracle.
• Experience with basic AWS services like EC2, S3, and RDS.
Technical Skills (Good to Have):
• Experience with AWS serverless services like Lambda, Step Functions, EMR/Glue, API
Gateway, etc. (a minimal Lambda handler sketch appears at the end of this document).
• Advanced knowledge of RDBMS – optimizing queries using query plans, indexes,
PL/SQL.
• Experience in Big Data frameworks like Spark and Hadoop.
• Experience working with CI/CD systems, preferably AWS CodePipeline and CodeBuild.
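For illustration only (not an additional requirement): a minimal AWS Lambda handler sketch in Python using boto3. The event fields (bucket, key) are assumptions made up for this example; real event shapes depend on how the function is triggered.

# Minimal Lambda handler sketch: look up an S3 object's size.
import json

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    bucket = event["bucket"]  # assumed event field, illustrative only
    key = event["key"]        # assumed event field, illustrative only
    head = s3.head_object(Bucket=bucket, Key=key)
    return {
        "statusCode": 200,
        "body": json.dumps({"key": key, "size": head["ContentLength"]}),
    }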