
Client looking for Data and Analytics Architect at Tallahassee, FL

Jobs via Dice

Job Location: Tallahassee, FL, USA

Posted on: 2025-08-14T00:59:34Z

Job Description:

Our Fortune 500 client, Stellar IT Solution, is seeking a Data and Analytics Architect/Engineer for a project based in Tallahassee, FL (hybrid).

Job Title: Data and Analytics Architect/Engineer
Location: Tallahassee, FL (Hybrid)
Duration: 24 months
Job Description:
  • Provide subject matter expertise across all EDAP components.
  • Reverse engineer and analyze current systems using Informatica CDP.
  • Document logical/physical data models using ERwin.
  • Conduct needs analysis and map business processes.
  • Collaborate with Data Governance for data standards and quality rules.
  • Support metadata management with Informatica MCC/CDGC.
  • Design and maintain target data models for data warehouse and lakehouse (Delta, Iceberg, Hudi).
  • Engineer data pipelines and source connectors with EDAP tools.
  • Implement IDMC 360 SaaS MDM and RDM solutions.
  • Integrate and support API-based data exchange solutions.
  • Collaborate with OIT to optimize Snowflake and Databricks environments.
  • Ensure HIPAA, security, and privacy compliance.
  • Design fine-grained access controls using RBAC, PBAC, ABAC models.
  • Participate in unit, integration, regression, and user acceptance testing.
Required Qualifications & Experience:
Education:
  • Bachelor's degree in Computer Science, Data/Analytics, Information Systems, or related fields.
Certifications:
  • Certified Data Management Professional (CDMP) preferred.
  • Alternatively, 18+ hours of relevant webinars/conferences in the past 3 years.
Technical Skills (Required):
  • Data Modeling (ER, Logical, Conceptual, 3NF, Dimensional, Delta/Iceberg): 7+ years.
  • Metadata and Data Cataloging: 2+ years.
  • SQL Programming: 5+ years.
  • Python or OOP Language: 3+ years.
  • AWS Services (S3, Glue, Lambda, RDS, Redshift): 2+ years.
  • Data Pipeline/Integration Tools (ETL/ELT, CDC, Streaming): 5+ years.
  • Informatica Tools (CDI, CDQ, CDP, CDIR, MCC, CD IDMC): 2+ years.
  • Data Warehouse and Lakehouse Engineering (Delta Lake, Hudi, Iceberg): 2+ years.
  • Cloud Platforms (AWS, Snowflake, Databricks, Azure): 3+ years.
  • Cloud BI Tools: ESRI, Qlik, Tableau.
  • Data Science/ML Tools: Dataiku (2+ years), including statistical lifecycle and endpoints.
  • Security and Compliance: HIPAA, data access controls (RBAC, ABAC).
  • Ability to translate complex business needs into data solutions.

Please send your updated resume in Word format along with your best contact details.

Additional Details:
  • Seniority level: Mid-Senior level
  • Employment type: Full-time
  • Job function: Other
  • Industries: Software Development
Apply Now!
