
Data Engineer

Atlantic Group

Job Location: New York, NY, USA

Posted on: 2025-04-27T23:22:49Z

Job Description:

Our client, a real estate and asset-based lender, is looking to hire a full-time Analytics Engineer to work onsite out of their Midtown Manhattan location.

The role sits on a dynamic team focused on optimizing the firm's asset management operations and business intelligence (BI) capabilities. It combines technical data engineering expertise with analytics and data science skills to drive data-informed decision-making across the firm's portfolio.

Responsibilities:

  • Build automated reporting systems and interactive dashboards for portfolio monitoring, including custom analyses for executive leadership, asset management, and origination
  • Implement machine learning (ML) models for asset valuation, market analysis, and investment opportunity screening
  • Build and optimize Snowflake databases and queries to support real-time business intelligence needs
  • Design and implement quality assurance processes for data extraction, transformation, and analysis workflows
  • Design and maintain scalable data pipelines in Nexla and Python to integrate property management systems, financial databases, and market data feeds into the firm's Snowflake data warehouse
  • Develop and implement OCR/NLP models to extract, validate, and classify key information from loan agreements, property reports, and other financial documents
  • Create predictive models to identify asset performance trends, risks, and opportunities across the real estate portfolio, with a focus on occupancy rates and NOI metrics
  • Design and optimize ETL processes to ensure data quality and consistency, with robust monitoring and alert systems (a brief illustrative sketch follows this list)
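
For illustration only, a minimal Python sketch of the kind of pipeline validation step described above. The column names, thresholds, and sample data are hypothetical, and Nexla orchestration and the Snowflake load itself are omitted:

  # Sketch: a single pipeline step that validates and cleans a property-level
  # feed before it is loaded into the warehouse. Column names are illustrative.
  import pandas as pd

  REQUIRED_COLUMNS = ["property_id", "as_of_date", "occupancy_rate", "noi"]

  def validate_feed(df: pd.DataFrame) -> pd.DataFrame:
      """Basic quality checks: required columns, types, and value ranges."""
      missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
      if missing:
          raise ValueError(f"Feed is missing required columns: {missing}")

      df = df.copy()
      df["as_of_date"] = pd.to_datetime(df["as_of_date"], errors="coerce")
      df["occupancy_rate"] = pd.to_numeric(df["occupancy_rate"], errors="coerce")

      # Flag rows that would silently distort portfolio-level metrics.
      bad_rows = df[
          df["as_of_date"].isna()
          | df["occupancy_rate"].isna()
          | ~df["occupancy_rate"].between(0.0, 1.0)
      ]
      if not bad_rows.empty:
          # In a real pipeline this would feed a monitoring/alerting system.
          print(f"Quarantining {len(bad_rows)} rows that failed validation")

      return df.drop(bad_rows.index)

  if __name__ == "__main__":
      sample = pd.DataFrame({
          "property_id": [101, 102, 103],
          "as_of_date": ["2025-03-31", "2025-03-31", "not-a-date"],
          "occupancy_rate": [0.94, 1.25, 0.88],
          "noi": [1_250_000, 980_000, 410_000],
      })
      print(validate_feed(sample))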

Qualifications:

  • Bachelor's or Master's Degree in Computer Science, Data Science, or related field with 3-7 years of experience; additional experience may be considered in lieu of degree
  • Expert-level Python programming with strong proficiency in data science libraries (pandas, numpy, scikit-learn) and ML frameworks (TensorFlow, PyTorch)
  • Experience building and optimizing ETL pipelines using modern data platforms (they use Nexla) and working with Snowflake or similar cloud data warehouses
  • Demonstrated experience with large language models (LLMs), prompt engineering, and NLP frameworks (e.g., Hugging Face Transformers) for document processing and information extraction
  • Proficiency in data preprocessing, cleaning, and transformation techniques for both structured and unstructured data sources
  • Experience with supervised and unsupervised learning algorithms, model evaluation metrics, and ML deployment in production environments (see the illustrative sketch after this list)
  • Advanced SQL expertise, particularly with Snowflake, including optimization and security best practices
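
For illustration only, a minimal scikit-learn sketch of the supervised modeling and evaluation workflow referenced above. The features and target are synthetic stand-ins for portfolio metrics, not a prescribed approach:

  # Sketch: fit and evaluate a simple supervised model on synthetic data.
  import numpy as np
  from sklearn.ensemble import RandomForestRegressor
  from sklearn.metrics import mean_absolute_error
  from sklearn.model_selection import train_test_split

  rng = np.random.default_rng(42)
  n = 500

  # Hypothetical asset-level features: occupancy rate, loan-to-value, market index.
  X = np.column_stack([
      rng.uniform(0.6, 1.0, n),   # occupancy_rate
      rng.uniform(0.4, 0.9, n),   # loan_to_value
      rng.normal(100, 10, n),     # local_market_index
  ])
  # Hypothetical target: next-period NOI growth, loosely driven by the features.
  y = 0.05 * X[:, 0] - 0.03 * X[:, 1] + 0.001 * X[:, 2] + rng.normal(0, 0.01, n)

  X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
  model = RandomForestRegressor(n_estimators=200, random_state=0)
  model.fit(X_train, y_train)

  preds = model.predict(X_test)
  print(f"MAE on held-out data: {mean_absolute_error(y_test, preds):.4f}")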
