Perch Insights
SDE I/II Data Engineer
Company Overview:
Perch is an AI-driven data analytics platform that helps customer experience (CX) organizations drive excellence through better data-driven decision making. No other CX business intelligence platform structures data from multiple sources, such as CRMs, customer service platforms, contact center applications, chatbots, marketing platforms, and HR systems, into a single source of truth and then delivers actionable insights the way Perch does. Perch offers the best way for CX leaders and their colleagues across the enterprise to see the forest for the trees, increasing market share, improving the customer journey, and improving operational efficiency. We are achieving these goals by leveraging the latest technologies in AI, data, and software engineering.
Here are some of the core product modules that we are working on every day:
- AI Copilot provides a revolutionary new analytics experience in which users ask business questions and the AI interprets them and responds with relevant charts and insights. It is actively being built using the latest GPT/LLM technologies along with our own semantic layer and AI models to generate insights and answers.
- Guided Analytics provides dashboards for fast, actionable root cause analysis powered by AI-driven contribution-to-change (CTC) models that help identify which factors are driving KPIs higher or lower the most.
- Proactive Alerts monitor the data for real-time insights and push them to users for faster action and higher impact across multiple channels.
- CX Data Platform combines a standard data model based on our domain expertise with best-of-breed, scalable data pipelines and transformations leveraging DBT, Airflow, Airbyte and other modern data stack technologies.
The Perch founding team has a unique set of experience and expertise across business operations, analytics, and engineering in the CX and contact center industries. The Perch founders built Full Potential Solutions (Perch’s parent company) into a large, rapidly growing business with thousands of employees in five years, and launched Perch to take advantage of the massive market opportunity in AI-driven data analytics, specifically in the CX space.

Perch has a unique culture focused on conscious leadership. We believe that increasing self-awareness leads to maximum growth for each individual and for the team as a whole, and that work should be an environment in which to practice growing and becoming a better person each day; through that process we will also achieve greatness as a company. We strive to live each day by our core Perch Insights values of curiosity, candor, integrity, ambition, accountability, and mindfulness. If this sounds exciting to you, apply to join our team.
Responsibilities
The SDE I/II Data Engineer will be responsible for:
- Discovering data sources and determining the best extract/load (EL) approach and tools to bring data into the data warehouse
- Determining data cleansing rules based on the source data and ensuring data replication accuracy (a minimal illustration follows this list)
- Coordinating and collaborating with data architects and BI engineers to ensure timely project delivery
- Understanding data warehouse concepts and the implementation methods for warehouse objects
- Understanding the domain, business cases, objectives, and KPIs that need to be reported and analyzed
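As a concrete illustration of the replication-accuracy point above, here is a minimal sketch in Python. It compares two cheap invariants (row count and a column checksum) between a source table and its warehouse replica; sqlite3 and the `orders` table are hypothetical stand-ins for real source and warehouse connections, not part of Perch’s stack.

```python
# Minimal replication-accuracy check: compare cheap invariants
# (row count, column sum) between a source table and its replica.
# sqlite3 and the "orders" schema are illustrative stand-ins.
import sqlite3

def table_stats(conn: sqlite3.Connection, table: str, col: str):
    """Return (row_count, column_checksum) for one table."""
    sql = f"SELECT COUNT(*), COALESCE(SUM({col}), 0) FROM {table}"
    return conn.execute(sql).fetchone()

# In-memory databases keep the sketch self-contained and runnable.
source, replica = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for db in (source, replica):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, 19.99), (2, 5.00), (3, 42.50)])

src = table_stats(source, "orders", "amount")
dst = table_stats(replica, "orders", "amount")
assert src == dst, f"replica drift: source={src}, replica={dst}"
print("replication check passed:", src)
```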
Qualifications
Technical Requirements
- Bachelor’s degree in a Technical/Quantitative subject such as Computer Science (B.E/B.Tech/MCA)
- 2–4 years of relevant experience in data engineering and data warehouse/BI solutions
- Proficient in SQL, including query optimization, stored procedures, and ad-hoc data analysis
- Experience developing data pipelines using a combination of ETL/ELT tools and data processing frameworks
- Experience developing DBT models that implement the data model, including data transformations and test cases
- Experience orchestrating data integration and transformation processing from source systems to single sources of truth (SSOTs) in the data warehouse
- Hands-on experience with Python, Scala, R, and Bash/shell scripting, and with frameworks such as Spark
- Must have worked with relational databases and/or cloud data warehouses (MySQL, PostgreSQL, Oracle, Redshift, Snowflake, etc.)
- Hands-on experience with ETL/ELT tools such as Talend, Informatica, SSIS, Airbyte, Alteryx, and Stitch
- Experience working with orchestration tools such as Airflow, Astronomer, Control-M, and Prefect (see the sketch after this list)
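To make the pipeline and orchestration requirements concrete, here is a minimal sketch of an Airflow DAG that chains an extract/load step with a dbt run-and-test step. It assumes Airflow 2.4+ (for the `schedule` argument); the DAG id and shell commands are hypothetical placeholders, not a prescribed implementation.

```python
# Minimal Airflow DAG sketch: an extract/load task followed by a dbt
# transform task. Assumes Airflow 2.4+; dag_id and bash commands are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="cx_warehouse_elt",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ spelling
    catchup=False,
) as dag:
    # Extract/load: in practice this might trigger an Airbyte or Stitch
    # sync; an echo keeps the sketch self-contained.
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="echo 'trigger EL sync here'",
    )

    # Transform: build the dbt models in the warehouse, then run their tests.
    transform = BashOperator(
        task_id="dbt_run_and_test",
        bash_command="dbt run && dbt test",
    )

    # EL must finish before transformation begins.
    extract_load >> transform
```

Keeping extraction and transformation as separate tasks means a failed dbt test can be retried without re-pulling data from the sources.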
Skills that will give you an edge (Good to Have)
- Well-versed in data warehouse modeling concepts (star schema, snowflake schema, denormalized structures); see the sketch below
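For the dimensional-modeling bullet above, here is a small self-contained sketch of a star schema: one fact table with foreign keys into two dimension tables, plus a typical rollup query. sqlite3 and the contact-center-flavored schema are illustrative assumptions, not Perch’s actual data model.

```python
# Tiny star-schema sketch: a fact table keyed into two dimension tables,
# and a rollup query joining them. sqlite3 and this schema are
# illustrative assumptions, not an actual warehouse design.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE dim_agent   (agent_id INTEGER PRIMARY KEY, name TEXT, team TEXT);
    CREATE TABLE dim_channel (channel_id INTEGER PRIMARY KEY, channel TEXT);
    -- Fact table: one row per handled contact, with FKs into the dimensions.
    CREATE TABLE fact_contact (
        contact_id  INTEGER PRIMARY KEY,
        agent_id    INTEGER REFERENCES dim_agent(agent_id),
        channel_id  INTEGER REFERENCES dim_channel(channel_id),
        handle_secs INTEGER
    );
    INSERT INTO dim_agent   VALUES (1, 'Ana', 'Tier 1'), (2, 'Raj', 'Tier 2');
    INSERT INTO dim_channel VALUES (1, 'voice'), (2, 'chat');
    INSERT INTO fact_contact VALUES
        (100, 1, 1, 310), (101, 1, 2, 150), (102, 2, 1, 520);
""")

# Typical star-schema rollup: average handle time by team and channel.
for team, channel, avg_secs in db.execute("""
    SELECT a.team, c.channel, AVG(f.handle_secs)
    FROM fact_contact f
    JOIN dim_agent   a ON a.agent_id   = f.agent_id
    JOIN dim_channel c ON c.channel_id = f.channel_id
    GROUP BY a.team, c.channel
"""):
    print(team, channel, avg_secs)
```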