Company: Tailored Management
Data Scientist: Data Engineering & BI Expert
Rate: $63.31 per hour
Location: 100% Remote (candidates must reside in the EST or CST time zones)
Contract Duration: 6 months
The Mission
We are looking for a hybrid Data Scientist who balances technical engineering rigor with a sharp business mind. This isn't just about moving data; it's about owning high-priority projects like A/B testing analytics and enterprise-grade reporting. You will build the pipelines, design the dashboards, and ensure the business has the insights it needs to move fast.
High-Priority Impact Areas
- Experimental Analytics: Develop and automate sophisticated A/B testing reports to measure project success.
- BI Excellence: Architect and manage high-performance Power BI dashboards.
- Data Infrastructure: Leverage Databricks to build and optimize robust data pipelines.
Key Responsibilities
- Pipeline Engineering: Design and maintain automated ETL workflows using Python and Databricks Jobs.
- End-to-End BI: From data gateways to final visualizations, own the entire reporting lifecycle (Power BI, Tableau, or Looker).
- Data Governance: Ensure the "Gold Standard" of data quality, consistency, and reliability across all enterprise systems.
- Business Partnership: Collaborate with stakeholders to translate business questions into technical requirements and actionable insights.
- DevOps Integration: Manage code repositories via GitHub Enterprise and implement CI/CD principles for data workflows.
What You'll Need to Succeed
- The Tech Stack: Advanced proficiency in SQL, Python, and Databricks.
- Visualization Mastery: Deep experience in dashboard development and connecting to cloud platforms such as Azure.
- Engineering Mindset: Familiarity with CI/CD, GitHub Enterprise, and optimizing for scalability.
- Independence: A self-starter who takes total ownership of projects from day one.
Preferred Qualifications
- 2+ years in retail or enterprise reporting environments.
- Hands-on experience with PySpark for large-scale data processing.
- Comfort working in Agile development cycles.
