- Data Science, Modeling, and Insights
- Build statistical and machine-learning models to detect anomalies, forecast performance, and identify optimization opportunities
- Design and evaluate experiments and model validation approaches; translate results into clear recommendations for engineering and operations
- Develop dashboards, reports, and model performance metrics to communicate insights and drive data-informed decisions
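As a minimal illustration of the anomaly-detection work described above, the sketch below flags points that deviate from a trailing-window mean by more than a z-score threshold (the window size and threshold are hypothetical defaults, not prescribed values):

```python
from statistics import mean, stdev

def zscore_anomalies(values, window=20, threshold=3.0):
    """Flag points whose deviation from the trailing `window` mean exceeds
    `threshold` standard deviations. Returns a list of (index, value)."""
    flagged = []
    for i in range(window, len(values)):
        ref = values[i - window:i]
        mu, sigma = mean(ref), stdev(ref)
        if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
            flagged.append((i, values[i]))
    return flagged
```

In practice this kind of rule is a baseline; model-based detectors would be validated against it.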
|
- PI System Administration & Time-Series Data Engineering
- Administer and support the PI System (OSIsoft PI / AVEVA PI), including tag strategy, data quality monitoring, and user support
- Build and maintain PI AF structure (assets, templates, attributes) and documentation to provide governed context for analytics and reporting
- Support PI interfaces/data flows and collaborate with OT/IT and engineers to validate sensors/tags, troubleshoot gaps, and improve reliability and performance
- Create curated datasets, features, and labels from PI data (with clear definitions and lineage) to support Seeq analyses and ML modeling
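A curated feature with clear definitions and lineage, as described above, can be sketched as an hourly aggregation plus a definition record (the tag name `UNIT1.FLOW` and units are hypothetical examples, not actual PI tags):

```python
from datetime import datetime

# Hypothetical feature definition: documents source tag, aggregation, and units
# so downstream Seeq analyses and ML models share one governed meaning.
FEATURE_DEFS = {
    "flow_hourly_mean": {
        "source_tag": "UNIT1.FLOW",     # illustrative tag name
        "aggregation": "mean over 1 h",
        "units": "m3/h",
    },
}

def hourly_features(samples):
    """Aggregate raw (timestamp, value) readings into hourly means.
    Returns {hour_start: mean_value}, sorted by hour."""
    buckets = {}
    for ts, val in samples:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets.setdefault(hour, []).append(val)
    return {h: sum(v) / len(v) for h, v in sorted(buckets.items())}
```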
|
- Seeq Development & AI-Enabled Tools
- Develop and maintain Seeq Workbooks/Analyses for performance monitoring, anomaly detection, and root-cause investigations
- Create reusable Seeq templates, calculation standards, and best practices; enable users through documentation and training
- Build AI-enabled tools (e.g., copilots, guided diagnostics, automated summaries) that leverage governed PI/Seeq context to accelerate engineering workflows
- Evaluate, monitor, and improve AI tool quality (accuracy, drift, user feedback), and implement practical guardrails for safe, reliable use
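One simple form of the quality monitoring and guardrails mentioned above is tracking user correctness feedback over a rolling window and flagging drift when recent accuracy falls below an accepted baseline (the baseline, window, and tolerance values here are illustrative assumptions):

```python
from collections import deque

class QualityMonitor:
    """Track recent correctness feedback for an AI-assisted tool and flag
    drift when accuracy over the last `window` interactions falls more than
    `tolerance` below the accepted baseline."""

    def __init__(self, baseline=0.9, window=50, tolerance=0.1):
        self.baseline = baseline
        self.tolerance = tolerance
        self.feedback = deque(maxlen=window)  # 1 = correct, 0 = incorrect

    def record(self, correct):
        self.feedback.append(1 if correct else 0)

    def accuracy(self):
        return sum(self.feedback) / len(self.feedback) if self.feedback else None

    def drifting(self):
        acc = self.accuracy()
        return acc is not None and acc < self.baseline - self.tolerance
```

A drift flag like this could gate the tool (e.g., fall back to manual review) rather than silently serving degraded answers.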
|
- Python, Analytics Engineering & Deployment
- Develop and maintain Python-based pipelines for data extraction, preprocessing, modeling, and automation
- Prototype and productionize analytical applications that support performance monitoring, anomaly detection, and forecasting
- Automate recurring model runs, evaluations, and reporting workflows with attention to reproducibility and reliability
- Improve existing analytics codebases; contribute to model monitoring, documentation, and maintainable data science practices
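The reproducibility concern above can be sketched as a pipeline run that returns a hash of its configuration alongside its output, so identical settings are traceable across repeated runs (the extract/clean/summarize steps here are a deliberately minimal stand-in):

```python
import hashlib
import json

def run_pipeline(raw, config):
    """Minimal extract -> clean -> summarize run. Also returns a short
    config hash so repeated runs with identical settings are traceable."""
    cleaned = [x for x in raw if config["lo"] <= x <= config["hi"]]
    summary = {
        "n": len(cleaned),
        "mean": sum(cleaned) / len(cleaned) if cleaned else None,
    }
    # sort_keys makes the hash stable regardless of dict insertion order
    run_id = hashlib.sha256(
        json.dumps(config, sort_keys=True).encode()
    ).hexdigest()[:12]
    return summary, run_id
```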
|
- Project & Engineering Partnership
- Collaborate with engineers and subject matter experts to frame operational problems into measurable data science objectives
- Provide analytical support for initiatives including data validation, statistical analysis, modeling, and performance reporting
- Help standardize modeling approaches, feature definitions, and evaluation metrics across projects
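Standardizing evaluation metrics across projects can be as simple as one shared module holding the agreed definitions, e.g. (metric choices here are illustrative):

```python
def mae(actual, predicted):
    """Mean absolute error -- one shared definition used across projects."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean absolute percentage error; skips zero actuals to avoid
    division by zero."""
    pairs = [(a, p) for a, p in zip(actual, predicted) if a != 0]
    return 100.0 * sum(abs((a - p) / a) for a, p in pairs) / len(pairs)
```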
|
- Data Quality, Governance & Monitoring
- Ensure accuracy and reliability of datasets used for analysis and modeling (validation checks, outlier handling, sensor sanity checks)
- Perform data cleaning, validation, and documentation, including assumptions, feature definitions, and dataset lineage
- Maintain organized analytical workflows and pipelines to support repeatable modeling and ongoing monitoring
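The validation, outlier, and sensor sanity checks above can be sketched as one pass that reports missing readings, physical-range violations, and flatlines (a common stuck-sensor signature); the range limits and flatline length are hypothetical:

```python
def sanity_check(values, lo, hi, flatline_len=5):
    """Run basic validation on a sensor series: missing readings (None),
    physical-range violations, and suspicious flatlines (identical
    consecutive values, which often indicate a stuck sensor)."""
    issues = []
    if any(v is None for v in values):
        issues.append("missing")
    present = [v for v in values if v is not None]
    if any(v < lo or v > hi for v in present):
        issues.append("out_of_range")
    run = 1
    for a, b in zip(present, present[1:]):
        run = run + 1 if a == b else 1
        if run >= flatline_len:
            issues.append("flatline")
            break
    return issues
```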
|