Data Analyst
BCforward Private Limited
Full-time | Contract
Columbus, OH
Job description
Job Responsibilities
We are standing up a modern data platform to support enterprise-wide data management, analytics, and advanced reporting. The platform will leverage data modeling, data lakes, and data science workflows to ensure scalability, reliability, and actionable insights. This role involves working with cutting-edge technologies such as Cogndum, Treeno, Databricks, and modern data science platforms.
Responsibilities
- Design, develop, and maintain data models using Cogndum and Treeno, ensuring they align with business and technical requirements.
- Build scalable data pipelines and ETL workflows using Databricks to ingest, process, and transform large datasets.
- Architect and manage data lakes for structured and unstructured data, ensuring data availability, integrity, and governance.
- Write and optimize SQL queries for data analysis, reporting, and integration across multiple systems.
- Develop and implement Python-based data science workflows, including data preprocessing, modeling, and deployment of ML solutions.
- Use Java for back-end integrations, APIs, or high-performance data processing where required.
- Partner with business stakeholders, data scientists, and engineers to translate requirements into technical solutions.
- Ensure compliance with data governance, security, and regulatory standards.
- Prepare and maintain clear documentation of data models, workflows, and processes.
Required Skills & Qualifications
- 5+ years of experience in data modeling, data engineering, or data analytics.
- Hands-on expertise in Data Modeling using Cogndum and Treeno.
- Strong experience with Databricks for building pipelines, transformations, and analytics.
- Proficiency in data lake architecture and implementation.
- Advanced knowledge of SQL for complex queries, tuning, and reporting.
- Strong programming experience with Python (data wrangling, ML workflows) and Java (back-end integration, processing).
- Experience in data science workflows, including exploratory data analysis (EDA), feature engineering, and model integration.
- Excellent problem-solving, analytical, and communication skills.
Nice to Have
- Experience working with AWS cloud services (S3, Redshift, Glue, EMR, Lambda).
- Familiarity with Jupyter Notebooks for prototyping, data analysis, and collaborative development.
- Exposure to AI/ML projects and model deployment pipelines.
- Experience in agile methodologies and tools such as Jira/Confluence.
Pay: $50.00 - $70.00 per hour
Benefits:
- 401(k)
- Dental insurance
- Health insurance
- Vision insurance
Experience:
- Alteryx: 3 years (Required)
- Tableau: 3 years (Required)
- SQL: 3 years (Required)
- Data visualization: 3 years (Required)
Work Location: In person