Senior Backend Engineer - Analytics, Noida job opportunity at Level AI.



Date: 2025-08-20
Level AI Senior Backend Engineer - Analytics, Noida
Experience: 3+ years
Employment Type: Full Time
Degree: PhD
Location: Noida, India

<p><span style="font-size: 16px;">Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionises customer engagement by transforming contact centres into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organisations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.</span></p><p><br></p><p><br></p><p><b style="font-size: 12pt;">Competencies:</b></p><p><br></p><p><b style="font-size: 12pt;">Data Modelling:</b><span style="font-size: 12pt;"> Skilled in designing data warehouse schemas (e.g., star and snowflake schemas), with experience in fact and dimension tables, as well as normalization and denormalization techniques.</span></p><p><b style="font-size: 12pt;">Data Warehousing &amp; Storage Solutions</b><span style="font-size: 12pt;">: Proficient with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.</span></p><p><b style="font-size: 12pt;">ETL/ELT Processes</b><span style="font-size: 12pt;">: Expertise in ETL/ELT tools (e.g., Apache NiFi, Apache Airflow, Informatica, Talend, dbt) to facilitate data movement from source systems to the data warehouse.</span></p><p><b style="font-size: 12pt;">SQL Proficiency</b><span style="font-size: 12pt;">: Advanced SQL skills for complex queries, indexing, and performance tuning.</span></p><p><b style="font-size: 12pt;">Programming Skills</b><span style="font-size: 12pt;">: Strong in Python or Java for building custom data pipelines and handling advanced data transformations.</span></p><p><b style="font-size: 12pt;">Data Integration</b><span style="font-size: 12pt;">: Experience with real-time data integration tools like Apache Kafka, 
Apache Spark, AWS Glue, Fivetran, and Stitch.</span></p><p><b style="font-size: 12pt;">Data Pipeline Management</b><span style="font-size: 12pt;">: Familiar with workflow automation tools (e.g., Apache Airflow, Luigi) to orchestrate and monitor data pipelines.</span></p><p><b style="font-size: 12pt;">APIs and Data Feeds</b><span style="font-size: 12pt;">: Knowledgeable in API-based integrations, especially for aggregating data from distributed sources.</span></p><p><br></p><b>Responsibilities - </b><ul><li>Design and implement analytical platforms that provide insightful dashboards to customers.</li><li>Develop and maintain data warehouse schemas, such as star schemas, fact tables, and dimensions, to support efficient querying and data access.</li><li>Oversee data propagation from source databases to warehouse-specific databases and tools, ensuring data accuracy, reliability, and timeliness.</li><li>Ensure the architectural design is extensible and scalable to adapt to future needs.</li></ul><div><br></div><p><br></p><b>Requirements - </b><ul><li>Qualification: B.E/B.Tech/M.E/M.Tech/PhD from a tier-1 engineering institute, with relevant work experience at a top technology company.</li><li>3+ years of backend and infrastructure experience, with a strong track record in development, architecture, and design.</li><li>Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.</li><li>Experience navigating large-scale systems, complex codebases, and architectural patterns.</li><li>Proven experience in building high-scale data platforms.</li><li>Strong expertise in data warehouse schema design (star schema, fact tables, dimensions).</li><li>Experience with data movement, transformation, and integration tools for data propagation across systems.</li><li>Ability to evaluate and implement best practices in data architecture for scalable solutions.</li></ul><div><br></div><div>Nice to have: 
</div><ul><li>Experience with Google Cloud, Django, Postgres, Celery, Redis.</li><li>Some experience with AI infrastructure and operations.</li></ul><p><br></p><p>To learn more, visit:&nbsp;<a rel="noopener noreferrer" class="postings-link" href="https://thelevel.ai/">https://thelevel.ai/</a></p><p>Funding:&nbsp;<a rel="noopener noreferrer" class="postings-link" href="https://www.crunchbase.com/organization/level-ai">https://www.crunchbase.com/organization/level-ai</a></p><p>LinkedIn:&nbsp;<a rel="noopener noreferrer" class="postings-link" href="https://www.linkedin.com/company/level-ai/">https://www.linkedin.com/company/level-ai/</a></p><p><span style="font-size: 10.5pt;">Our AI platform:&nbsp;</span><a rel="noopener noreferrer" class="postings-link" style="font-size: 10.5pt;" href="https://www.youtube.com/watch?v=g06q2V_kb-s">https://www.youtube.com/watch?v=g06q2V_kb-s</a></p>
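The star-schema design called out under Competencies (fact and dimension tables feeding dashboard queries) can be sketched in miniature. This is an illustrative example only, using SQLite via Python's standard library; all table and column names are hypothetical and are not taken from the role description:

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
# All names (dim_customer, fact_interactions, sentiment_score, ...) are
# hypothetical, chosen only to illustrate the modelling pattern.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    iso_date TEXT
);
CREATE TABLE fact_interactions (
    fact_id         INTEGER PRIMARY KEY,
    customer_key    INTEGER REFERENCES dim_customer(customer_key),
    date_key        INTEGER REFERENCES dim_date(date_key),
    sentiment_score REAL  -- additive measure stored on the fact row
);
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20250820, "2025-08-20")])
cur.executemany("INSERT INTO fact_interactions VALUES (?, ?, ?, ?)",
                [(1, 1, 20250820, 0.8),
                 (2, 1, 20250820, 0.6),
                 (3, 2, 20250820, 0.9)])

# Typical dashboard query: join facts to a dimension and aggregate.
rows = cur.execute("""
    SELECT c.customer_name, ROUND(AVG(f.sentiment_score), 2)
    FROM fact_interactions f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.customer_name
    ORDER BY c.customer_name
""").fetchall()
print(rows)  # → [('Acme', 0.7), ('Globex', 0.9)]
```

Dimensions stay small and descriptive while the fact table grows with event volume, which is why this shape supports the efficient querying the responsibilities section describes.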
