Req ID: RQ207119
Type of Requisition: Regular
Clearance Level Must Be Able to Obtain: None
Public Trust/Other Required: None
Job Family: IT Infrastructure and Operations
Skills:
Building Architecture, DevOps, Innovation
Experience:
15+ years of related experience
Job Description:
SOLUTIONS ARCHITECT SME
Support mission-critical initiatives and enable the growth of our business as a Solutions Architect SME at GDIT. Here, you'll become an integral part of how GDIT is able to ensure the safety and security of our nation.
MEANINGFUL WORK AND PERSONAL IMPACT
As a Solutions Architect SME, the work you'll do at GDIT will be impactful to the mission of the FAA. You will play a crucial role in the FAA's Enterprise Information Management (EIM) platform, a strategic initiative whose mission is to enable the agency to operate as an "information-centric enterprise." The program applies governance and management best practices to ensure information is delivered effectively and efficiently as a service.
WHAT YOU'LL NEED TO SUCCEED
AWS Cloud Cost Optimization & Management
Conducted detailed reviews of Enterprise Data Platform (EDP) AWS account dashboards, focusing on:
Identifying cost-saving opportunities by analyzing underutilized EC2 instances, including rightsizing workload instances to better match actual compute demand.
Discovering and purging orphaned or unused Elastic Block Store (EBS) volumes to reduce unnecessary storage costs.
Assessing Amazon S3 lifecycle policies for optimizing storage costs through tiered storage and archival strategies.
Recommended broader cost governance strategies such as:
Adoption of AWS Reserved Instances and Savings Plans (Compute and EC2 Instance) to lower compute costs through long-term commitments.
Use of serverless computing for applicable workloads to eliminate spend on idle capacity.
Setting up budgets, alerts, and ongoing monitoring to ensure spend stays within stakeholder expectations.
Leveraging AWS Cost Explorer, Trusted Advisor, and CloudWatch metrics to continuously track resource utilization and cost anomalies.
Emphasized the importance of infrastructure automation in cost management through scheduled asset shutdowns and infrastructure-as-code (IaC) practices for environment provisioning and teardown (a minimal shutdown sketch appears at the end of this section).
Promoted reviewing and rightsizing storage, including pruning EBS snapshots and managing S3 Intelligent-Tiering.
Recommended elimination of orphaned Elastic IPs and idle Load Balancers.
Encouraged migration to the latest AWS instance families (e.g., Graviton) for better price/performance, and use of R-family instances for Starburst.
Supported negotiation and cost management of EDP AWS contracts, including:
Understanding committed spend thresholds and discount tiers.
Balancing over-/under-usage clauses to avoid unforeseen costs.
Advocated embedding financial awareness within engineering teams to drive cost-conscious development through transparency and collaborative ownership of cloud bills.
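To illustrate the scheduled asset shutdowns referenced above, the following is a minimal sketch (not an EDP implementation) of a Lambda handler that an EventBridge schedule could invoke nightly to stop non-production EC2 instances. The AutoStop tag name is a hypothetical convention.

    import boto3

    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        # Find running instances tagged for automatic shutdown.
        # The AutoStop tag name is illustrative, not an EDP convention.
        reservations = ec2.describe_instances(
            Filters=[
                {"Name": "tag:AutoStop", "Values": ["true"]},
                {"Name": "instance-state-name", "Values": ["running"]},
            ]
        )["Reservations"]

        instance_ids = [
            inst["InstanceId"]
            for res in reservations
            for inst in res["Instances"]
        ]

        if instance_ids:
            # Stop (not terminate) so the instances can restart the next business day.
            ec2.stop_instances(InstanceIds=instance_ids)

        return {"stopped": instance_ids}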
Security, Access Control, and Compliance
Performed multiple security-finding reviews to identify issues, advising security officers on implementing timely, best-practice-aligned fixes, including:
Enforcing least-privilege access via Identity and Access Management (IAM) roles and multi-factor authentication (see the sketch after this list).
End-to-end data encryption for SaaS tool ingestion pipelines, both in transit and at rest.
Using zero-trust security principles to limit lateral movement and increase segmentation.
Real-time security monitoring and alerts using CloudWatch and Amazon GuardDuty.
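As a concrete illustration of the least-privilege point above, here is a minimal boto3 sketch that attaches a narrowly scoped inline policy to a role. The role name, policy name, and bucket are hypothetical.

    import json
    import boto3

    iam = boto3.client("iam")

    # Grant read-only access to a single S3 prefix and nothing else.
    policy_document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    "arn:aws:s3:::example-edp-bucket",
                    "arn:aws:s3:::example-edp-bucket/ingest/*",
                ],
            }
        ],
    }

    iam.put_role_policy(
        RoleName="example-ingest-role",
        PolicyName="least-privilege-s3-read",
        PolicyDocument=json.dumps(policy_document),
    )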
Designed secure connectivity for vendor systems hosted in the AWS GovCloud partition by leveraging AWS PrivateLink for one-way, secure data transfer to the EIM Data Platform, following architecture vetted for the CRE MITRE team.
Led design and review of cross-account Amazon Managed Streaming for Apache Kafka (MSK) access solutions between EIM and SVT accounts, and presented viable approaches to the team covering:
AWS Security Token Service (STS) AssumeRole with temporary credentials for fine-grained, tightly controlled access.
Cross-account VPC connectivity using PrivateLink, ensuring secure and isolated data streams.
Trade-offs between AssumeRole and OpenID Connect (OIDC) federation, highlighting flexibility, auditability, complexity, and risk.
Successfully collaborated with John H and the EDP support team to implement the selected cross-account MSK access solution using the approaches above (a minimal AssumeRole sketch follows).
Collaborated with the security officer on the System Impact Assessment (SIA) process to enable the EIM Data Challenge program to support external data challenge users outside standard EDP authentication boundaries, ensuring compliance and risk mitigation.
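For reference, a minimal sketch of the STS AssumeRole pattern described above. The account ID, role name, and cluster ARN are placeholders; actual produce/consume traffic would ride the PrivateLink-backed VPC connectivity rather than this control-plane call.

    import boto3

    # Assume a role in the other account to obtain short-lived credentials.
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn="arn:aws:iam::111111111111:role/example-msk-consumer-role",
        RoleSessionName="cross-account-msk-session",
        DurationSeconds=3600,
    )["Credentials"]

    # Use the temporary credentials to call MSK APIs in that account.
    kafka = boto3.client(
        "kafka",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

    # List the brokers the session can reach (cluster ARN is a placeholder).
    brokers = kafka.get_bootstrap_brokers(
        ClusterArn="arn:aws:kafka:us-east-1:111111111111:cluster/example/uuid"
    )
    print(brokers.get("BootstrapBrokerStringSaslIam"))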
AI/ML and Computational Environment Enhancements
Worked closely with AI/ML engineering teams to stabilize JupyterLab, mitigating pain points related to usability and system reliability.
Presented and assessed alternatives to the JupyterHub environment, including:
Kubernetes-based JupyterHub on Amazon EKS for scalable, multi-tenant deployments.
Jupyter Enterprise Gateway for kernel scaling and notebook serving.
Anaconda Team Notebooks offering collaborative notebooks with enterprise features.
Amazon SageMaker Studio for managed, integrated ML development with notebook capabilities.
Amazon EMR Studio to leverage Spark for analytical notebooks.
Supported the ADO 200 team in building streamlined deployment pipelines for Ollama runtime-based models:
Deployed the Aviation Safety machine learning model on an Amazon EC2 G6 GPU instance running Ubuntu, using CUDA libraries for GPU acceleration. G6 instances, powered by NVIDIA L4 Tensor Core GPUs, deliver high performance for deep learning inference, real-time AI processing, and scientific simulations.
Automated deployment using Amazon S3 event triggers that launch Amazon EC2 compute instances as needed for model processing. The automation reads deployment specifications from the uploaded S3 object's tags (specifically, the ModelInstanceType tag) to dynamically select the appropriate EC2 instance type. This enables scalable, event-driven launching of compute resources tailored to repeated machine learning workflows, optimizing resource utilization and speeding up model execution without manual intervention (see the sketch below).
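A minimal sketch of that event-driven launch pattern, assuming a Lambda function subscribed to the bucket's object-created events. The launch template name and default instance type are illustrative.

    import boto3

    s3 = boto3.client("s3")
    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        # The S3 put event carries the bucket and key of the uploaded model artifact.
        record = event["Records"][0]["s3"]
        bucket = record["bucket"]["name"]
        key = record["object"]["key"]

        # Read the ModelInstanceType tag from the uploaded object,
        # falling back to a default GPU instance type.
        tags = s3.get_object_tagging(Bucket=bucket, Key=key)["TagSet"]
        tag_map = {t["Key"]: t["Value"] for t in tags}
        instance_type = tag_map.get("ModelInstanceType", "g6.xlarge")

        # Launch a worker from a pre-built launch template (name is illustrative);
        # user data on the template would pull the artifact and run the model.
        ec2.run_instances(
            LaunchTemplate={"LaunchTemplateName": "example-ollama-worker"},
            InstanceType=instance_type,
            MinCount=1,
            MaxCount=1,
        )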
Worked with ADO 200 on the following:
Created an EDP AI/ML Engineer Azure AD app integrated with the AWS EDP identity provider.
Developed a Python Streamlit-based application for system administrators to provision Amazon SageMaker notebooks for EDP customers interested in exploring SageMaker's machine learning capabilities. The tool streamlines notebook creation, configuration, and access provisioning, enhancing user onboarding and enabling faster experimentation within the EDP environment (a minimal sketch appears after this list).
Enabled EDP users to seamlessly assume AWS Console roles with AI/ML privileges. This role grants controlled access to Amazon SageMaker instances as well as Amazon Bedrock Foundation models, limited specifically to Amazon-provided models. The access scope is governed through fine-grained IAM policies that restrict users to approved SageMaker and Bedrock resources, ensuring secure and compliant usage within the EDP ecosystem.
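A minimal sketch of such a Streamlit provisioning tool. The form fields and defaults are illustrative, and networking, encryption, and tagging options are omitted.

    import boto3
    import streamlit as st

    sm = boto3.client("sagemaker")

    st.title("EDP SageMaker Notebook Provisioning")

    # Inputs an administrator fills in for the requesting customer.
    notebook_name = st.text_input("Notebook instance name")
    instance_type = st.selectbox("Instance type", ["ml.t3.medium", "ml.m5.xlarge"])
    role_arn = st.text_input("Execution role ARN")

    if st.button("Provision notebook"):
        # Create the notebook instance; subnet and KMS settings omitted for brevity.
        sm.create_notebook_instance(
            NotebookInstanceName=notebook_name,
            InstanceType=instance_type,
            RoleArn=role_arn,
        )
        st.success(f"Provisioning started for {notebook_name}")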
Designed and developed an agent-based AI chatbot focused on Contract Data. The chatbot assists executive leadership by providing quick, accurate insights and answers, enabling faster and more informed decision-making. The solution leverages AI to interpret contract details and deliver relevant information in an interactive format (a minimal sketch of the underlying model call follows).
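A minimal sketch of the model call at the core of such a chatbot, with contract-context retrieval stubbed out and an example Amazon-provided model ID; this is illustrative rather than the deployed design.

    import boto3

    bedrock = boto3.client("bedrock-runtime")

    def answer_contract_question(question: str, contract_context: str) -> str:
        # Combine retrieved contract text (retrieval not shown) with the user's question.
        prompt = f"Contract details:\n{contract_context}\n\nQuestion: {question}"

        # Converse API call against an Amazon-provided foundation model (example model ID).
        response = bedrock.converse(
            modelId="amazon.titan-text-express-v1",
            messages=[{"role": "user", "content": [{"text": prompt}]}],
        )
        return response["output"]["message"]["content"][0]["text"]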
Platform Development & Integration
Designed and proposed an implementation of automated metadata ingestion between Starburst and Collibra:
Provided mechanisms for Starburst schema discovery and sync to Collibra's data catalog.
Promoted governance automation to enhance data catalog completeness.
Supported the Privacera upgrade effort by reviewing release notes, migration, and rollback strategies with Kyle and John H, contributing architectural inputs to minimize downtime and data policy desynchronization.
Architected self-service provisioning pipelines for AWS resources (EC2, SageMaker, RDS):
ServiceNow portal serves as front-end for user resource requests.
EDP-hosted API Gateway endpoints receive calls, authenticated with tokens.
Lambda functions orchestrate launch templates, CloudFormation stacks, or Terraform APIs to provision requested infrastructure.
Partially complete, with ongoing work toward full feature rollout (a minimal sketch of the Lambda stage follows).
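A minimal sketch of the Lambda stage of that pipeline, assuming API Gateway proxies the ServiceNow request body as JSON and a pre-approved CloudFormation template exists per resource type. The template URL, parameter names, and request fields are hypothetical.

    import json
    import boto3

    cfn = boto3.client("cloudformation")

    def lambda_handler(event, context):
        # API Gateway (fronted by ServiceNow) passes the request body as JSON.
        body = json.loads(event["body"])
        resource_type = body["resource_type"]   # e.g. "ec2", "sagemaker", "rds"
        requester = body["requester"]

        # Launch a pre-approved template for the requested resource type.
        stack = cfn.create_stack(
            StackName=f"edp-selfservice-{requester}-{resource_type}",
            TemplateURL=f"https://example-bucket.s3.amazonaws.com/templates/{resource_type}.yaml",
            Parameters=[{"ParameterKey": "Requester", "ParameterValue": requester}],
            Capabilities=["CAPABILITY_NAMED_IAM"],
        )

        return {"statusCode": 202, "body": json.dumps({"stackId": stack["StackId"]})}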
Deployed CloudBolt infrastructure in EDP non-prod setup for assessing commercial off-the-shelf (COTS) self-service platforms, providing support to Anteneh and Fritz throughout.
Coordinated with the RAISE team to review commercial platform requirements for seamless production imports into EDP with full architectural compatibility.
Identity, Governance & Self-Service Enablement
Partnered with Anteneh to define ServiceNow groups for EDP AWS user management and onboarding.
Developed ServiceNow-based self-service forms enabling EDP users to request and provision AWS resources in a governed manner.
Leveraged the AWS Service Management Connector for seamless integration of the AWS Service Catalog and provisioning within ServiceNow.
Supported Denodo component review and guided integration with Starburst to enable federated queries and data virtualization.
Architected ingestion workflows to collect SaaS tool logs securely into EDP:
API-based log retrieval.
Centralized data storage on S3.
Stream processing via Lambda.
Indexing and search with Amazon OpenSearch for operational intelligence.
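A minimal sketch of the Lambda processing step in that workflow, assuming newline-delimited JSON log batches land in the central S3 bucket and the opensearch-py client is available. The domain endpoint, index name, and authentication handling are illustrative.

    import json
    import boto3
    from opensearchpy import OpenSearch

    s3 = boto3.client("s3")

    # Endpoint, credential signing, and index name are illustrative only.
    client = OpenSearch(
        hosts=[{"host": "example-domain.us-east-1.es.amazonaws.com", "port": 443}],
        use_ssl=True,
    )

    def lambda_handler(event, context):
        # Triggered when a SaaS log batch lands in the central S3 bucket.
        record = event["Records"][0]["s3"]
        body = s3.get_object(
            Bucket=record["bucket"]["name"],
            Key=record["object"]["key"],
        )["Body"].read()

        # One JSON log entry per line; index each for operational search.
        for line in body.decode("utf-8").splitlines():
            if line.strip():
                client.index(index="saas-tool-logs", body=json.loads(line))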
Data Lake Technologies & Ecosystem
Conducted thorough evaluation of Apache Iceberg, Apache Hudi, and Databricks Delta Lake with focus on:
Architectural differences such as snapshot-based metadata (Iceberg) versus copy-on-write with incremental ingestion (Hudi) and MLflow integration (Databricks); see the sketch after this list.
Strengths in batch processing (Iceberg), real-time ingestion (Hudi), streaming and batch hybrid (Delta).
Performance and scalability: Large batch workloads (Iceberg), frequent updates and low latency (Hudi), high performance for interactive queries (Delta).
Ecosystem integration with Hive, Spark, Presto, cloud storage.
Community support and backing (e.g., Netflix for Iceberg, Uber for Hudi, Databricks for Delta).
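To make the snapshot-based metadata point concrete, a minimal PySpark sketch against an Iceberg table. The catalog, warehouse location, and table names are illustrative, and the snapshot ID in the time-travel read is a placeholder taken from the snapshots listing.

    from pyspark.sql import SparkSession

    # Assumes the Iceberg Spark runtime jar and a catalog named "demo" are configured.
    spark = (SparkSession.builder
             .appName("iceberg-snapshot-demo")
             .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
             .config("spark.sql.catalog.demo.type", "hadoop")
             .config("spark.sql.catalog.demo.warehouse", "s3://example-bucket/warehouse")
             .getOrCreate())

    spark.sql("CREATE TABLE IF NOT EXISTS demo.db.flights (id BIGINT, origin STRING) USING iceberg")
    spark.sql("INSERT INTO demo.db.flights VALUES (1, 'IAD')")

    # Every commit produces a new snapshot; the metadata tables expose them.
    spark.sql("SELECT snapshot_id, committed_at FROM demo.db.flights.snapshots").show()

    # Time travel: read the table as of a previous snapshot (placeholder ID).
    spark.read.option("snapshot-id", 1234567890).table("demo.db.flights").show()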
Evaluated AWS and other cloud alternatives to Databricks, analyzing gaps in unification, collaboration, and AutoML capabilities:
Noted that despite EDP's broad toolset (EMR Presto, JupyterHub, SageMaker, Starburst), Databricks excels at integrated end-to-end workflows, collaboration, and AutoML with MLflow.
Advised on how Tyrion's offerings complement EDP containerized solutions leveraging Starburst data access.
Data Products & Data Mesh
Worked with the AHR team to explain Starburst's data product and data mesh features.
Supported onboarding data from EDR to EDP to enable the AHR team's full use of data product capabilities.
Modeled the effort on similar initiatives for ATO Data Products.
Collaboration & Advocacy
Actively engaged the Starburst team, advocating for enhancements aligned with EDP customer feedback.
Collaborated with the Alteryx SaaS team and Kyle to review architectural integration and governance for EDP alignment.
Guided the RAISE team on adopting the commercial platform solution into EDP production.
Additional Contributions
Proposed hardened container images from Iron Bank for EDP security compliance.
Clarified Tyrion integration for container-based applications querying EDP data.
Designed secure cross-account MSK access solutions leveraging AWS PrivateLink and IAM roles.
Coordinated with ServiceNow teams for self-service AWS provisioning forms and group definitions.
Provided technical guidance to team members as needed to resolve complex issues in a timely fashion.
Lucid Architecture Diagrams:
Shared all diagrams produced to date on Lucidgov with Anteneh Solomon.
GDIT IS YOUR PLACE
At GDIT, the mission is our purpose, and our people are at the center of everything we do.
Growth: AI-powered career tool that identifies career steps and learning opportunities
Support: An internal mobility team focused on helping you achieve your career goals
Rewards: Comprehensive benefits and wellness packages, 401K with company match, competitive pay and paid time off
Flexibility: Full-flex work week to own your priorities at work and at home
Community: Award-winning culture of innovation and a military-friendly workplace
OWN YOUR OPPORTUNITY
Explore a career at GDIT and you'll find endless opportunities to grow alongside colleagues who share your sense of ownership and pride in the meaningful work we do.
The likely salary range for this position is $161,500 - $218,500. This is not, however, a guarantee of compensation or salary. Rather, salary will be set based on experience, geographic location and possibly contractual requirements and could fall outside of this range.
Our benefits package for all US-based employees includes a variety of medical plan options, some with Health Savings Accounts, dental plan options, a vision plan, and a 401(k) plan offering the ability to contribute both pre and post-tax dollars up to the IRS annual limits and receive a company match. To encourage work/life balance, GDIT offers employees full flex work weeks where possible and a variety of paid time off plans, including vacation, sick and personal time, holidays, paid parental, military, bereavement and jury duty leave. GDIT typically provides new employees with 15 days of paid leave per calendar year to be used for vacations, personal business, and illness and an additional 10 paid holidays per year. Paid leave and paid holidays are prorated based on the employee's date of hire. The GDIT Paid Family Leave program provides a total of up to 160 hours of paid leave in a rolling 12 month period for eligible employees. To ensure our employees are able to protect their income, other offerings such as short and long-term disability benefits, life, accidental death and dismemberment, personal accident, critical illness and business travel and accident insurance are provided or available. We regularly review our Total Rewards package to ensure our offerings are competitive and reflect what our employees have told us they value most.
We are GDIT. A global technology and professional services company that delivers consulting, technology and mission services to every major agency across the U.S. government, defense and intelligence community. Our 30,000 experts extract the power of technology to create immediate value and deliver solutions at the edge of innovation. We operate across 50 countries worldwide, offering leading capabilities in digital modernization, AI/ML, Cloud, Cyber and application development. Together with our clients, we strive to create a safer, smarter world by harnessing the power of deep expertise and advanced technology.
Join our Talent Community to stay up to date on our career opportunities and events at https://gdit.com/tc.
Equal Opportunity Employer / Individuals with Disabilities / Protected Veterans