Infostretch is a pure-play digital engineering services firm focused on helping companies accelerate their digital initiatives from strategy and planning through execution. We leverage deep technical expertise, Agile methodologies and data-driven intelligence to modernize systems of engagement and simplify human/tech interaction. We deliver custom solutions that meet customers’ technology needs wherever they are in their digital lifecycle. Backed by Goldman Sachs and Everstone Capital, Infostretch works with both large enterprises and emerging innovators -- putting digital to work to enable new products and business models, engage with customers in new ways, and create sustainable competitive differentiation.
Specific responsibilities include:
· 10+ years of experience in large Data Warehouse / BI / Analytics implementations
· Serve as a key stakeholder influencing the roadmap of Infostretch’s Digital Therapeutics Data Platform.
· Managed or worked with teams of 15+ people
· Experienced in at least one legacy platform such as Teradata, Netezza, or Exadata
· Build, operate and maintain highly scalable and reliable data pipelines to enable data collection from various sources.
· Enable analysis and generation of insights from structured and unstructured data.
· Build data warehouse solutions that provide end-to-end management and traceability of patient longitudinal data, and that enable and optimize internal processes and product features.
· Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
· Deploy, support and productize analytics & visualization solutions to help improve the food delivery process.
· Build tools to support the use of AI/ML and other analytical models.
· Work with product and data science teams to leverage batch and streaming data, including unstructured, IoT, and image data.
· Collaborate with internal stakeholders to develop business domain concepts and data modeling approaches to problems faced by the organization in the analytics arena.
· Maintain and optimize existing data platform services and capabilities, identifying potential enhancements, performance improvements, and design improvements.
· Write and maintain unit/integration tests and systems documentation.
Desired Skills and Experience
Minimum qualifications:
· Extremely strong skills in at least one programming or scripting language (Java, Python, Julia, Ruby, Go).
· Has built and deployed into production large-scale batch and real-time data pipelines using technologies such as the AWS stack, Airflow, Spark, Cloudera, Hortonworks, and H2O.
· Deep understanding of how big-data-specific algorithms work, and experience building and maintaining high-performance algorithms.
· Deep experience with the AWS big data platform and services (Redshift, Redshift Spectrum, S3, Glacier, DynamoDB, Parquet/Avro/ORC, EKS, ECS).
· Experience with one or more data analytics and visualization packages (Tableau, Quicksight, MicroStrategy).
· Strong communication skills for working with stakeholders from various backgrounds.
· Expert knowledge of scaling and tuning large-scale distributed SQL and NoSQL systems.
· Strong quantitative, analytical, process development, facilitation and organizational skills required.
· Ability to multi-task, prioritize assignments, and work well under deadlines in a changing environment with a cross-functional agile team.
· 5+ years of experience in building and sustaining big data solutions, preferably in the Food & Beverage sector or another regulated industry.