DataAnnotation - San Diego
DataAnnotation is committed to creating quality AI. Join our team to help train AI chatbots while gaining the flexibility of remote work and choosing your own schedule.
We are looking for a proficient *programmer* to join our team to train our AI...
Katalyst Healthcares & Life Sciences - San Diego
in computer science, mathematics, statistics, or a related discipline and 3+ years' experience in the biopharmaceutical (or CRO) industry as a statistical programmer with advanced knowledge of SAS (Base, Stat, Macro, Graph). In addition, you must have at least...
Calsoft Labs - San Diego
Job Description: Top 5 Required Skills
1. Tableau Dashboarding and governance
2. Python - Scripting
3. Redshift Databases
4. Data Modelling
5. SQL
Technologies (specific tools and/or programs you would want to see on their resumes)
" AWS
"...
Artech LLC - San Diego
Job Description: Top 5 Required Skills
1. Tableau Dashboarding and governance
2. Python - Scripting
3. Redshift Databases
4. Data Modelling
5. SQL
Technologies (specific tools and/or programs you would want to see on their resumes)
" AWS
"...
Katalyst Healthcares & Life Sciences - Carlsbad, 30 mi from San Diego
and utilities to automate standard and frequent tasks, enhance quality and efficiency.
Requirements:
A minimum of a bachelor's degree in a scientific, computer science or related field, training in statistics preferred.
A Principal Statistical Programmer...
PEAK Technical Services Inc. - Oceanside, 35 mi from San Diego
ESSENTIAL DUTIES AND RESPONSIBILITIES
Contributes to continuous improvement & development projects
Provides design for manufacturing/CNC process input for quote completion.
Develops manufacturing processes by studying product requirements,...
SynergisticIT - Chula Vista, 9 mi from San Diego
Kroger, the Walt Disney Company and hundreds more. If you are tired of working with inefficient programmers who take a lot of time to ramp up we want you to try us. Our software programmers can hit the ground running and get you the maximum return on your...
SAIC - San Diego
SAIC is seeking an experienced Software Developer in support of our United States Air Force customer. This position will assist in the design and development of front-end user interfaces that integrate with cloud-based software systems and align with modern DevSecOps and Agile processes.
Writes program co...
IT Programmer Analyst - Programmer Analyst 1911
San Diego | www.resume-library.com
Job Description: Top 5 Required Skills
1. AWS
2. Data Engineering
3. SQL
4. Python
5. Informatica
Technologies (specific tools and/or programs you would want to see on their resumes)
" AWS, Redshift, Apache Airflow, Glue, Python, Informatica, Snowflake, IICS, ERP, Salesforce
Keywords (specific words, projects, programs you would want to see on their resumes)
" AWS, Redshift, Apache Airflow, Glue, Python, Informatica, Snowflake, IICS, ERP, Salesforce
" Data discovery, modeling, engineering, pipeline, data warehouse
Education Requirement
" Bachelor of Science in Computer Science, Information Technology, Data Science, or a related field.
Required Years of Experience
" 3 years min
Physical Requirements
" Push Max Weight Limit = 0
" Pull Max Weight Limit = 0
" Lift Max Weight Limit = 0
" Forklift Required (Y/N): N
Driving Requirements
" Are there driving responsibilities no matter how minimal with this role? No
" (If Yes)How many hours per week? n/a
" Work Location Requirement: 100% Onsite
" Work Address: 5775 Morehouse Drive, San Diego, CA
" Home Building: O
" Work Days: Mon-Fri
" Exact Shift Time: 8:30-5:00pm PST
" Weekly / Daily Expected Hours: 40.0 / 8.0
As a Data Engineer, you will play a pivotal role in shaping and implementing robust data architecture while constructing efficient data pipelines aligned with our organizational goals. This position supports various business functions, including Supply Chain, Finance, Sales & Marketing, and other corporate areas.
The ideal candidate will possess a deep understanding of data architecture principles, data discovery/modeling, and the seamless integration of emerging technologies into a cohesive data ecosystem that fosters innovation, operational excellence, and strategic insights.
Principal Duties and Responsibilities:
" Collaborate with cross-functional and global teams to comprehend business requirements and translate them into effective solutions.
" Design and manage resilient, scalable data models and architectures that cater to the evolving needs of the business, emphasizing efficiency, quality, and security.
" Implement data pipelines using tools such as Apache Airflow, AWS Glue, Redshift, S3, Python, and Informatica Cloud.
" Facilitate communication within and outside the project team to resolve conflicts related to implementation schedules, design complexities, and other challenges.
Good-to-have qualifications:
" Experience in Data Engineering leveraging AWS Redshift, S3, Glue, Airflow, Python, SQL and Kubernetes
" Familiarity with Informatica Cloud and Informatica Power Center is essential.
" Strong expertise in data modeling tools and methodologies, encompassing both structured and unstructured data environments.
" Demonstrated experience in data governance, data quality management, and data security practices.
" Exceptional analytical and problem-solving skills, enabling the translation of intricate technical challenges into actionable solutions.
" Ability to quickly learn and adapt to new technologies.
" Strong sense of ownership and growth mindset.
" Curiosity about the problem domain and an analytical approach.
" Strong influence skills to drive business adoption and change.
" Experience in data discovery of source systems like Oracle E-Business Suite and Salesforce is preferable.