Position: Azure DevOps Data Engineer
Location: Pinellas Park, FL (local FL candidates only)
Duration: Contract
Interview: Skype
Visa: USC/GC

A few additional notes:
The initial job description says this person will combine “technical support” with engineering and designing the data stack. The focus of the role is firmly on engineering and designing, almost architecting, how their data stack will best use Synapse. The “technical support” wording refers to covering any client-specific gaps in functionality that come up during the migration. This person may from time to time interact with third-party clients' technology staff, so they are looking for a strong communicator who will represent them well.
Note that Insurance or Finance/Banking experience is a big plus, but not a deal breaker; someone who has worked with high-volume transactional datasets is acceptable. The Oracle experience may be the one item that makes this candidate pool a bit challenging; as long as a candidate has experience migrating high-volume/complex data to Azure Synapse from any source, they are worth putting in front of the client.

Please share an updated resume and hotlist at vipin@ramyinfotech.com, thanks.
#c2crequirements #corptocorp #usstaffing #usjobs #c2cusajobs #hotlists #c2cjobs #resume #top10 #recruiters #recruitment #indeedjobs #indeedusa #usaindeed #connections #itbenchmarking #vendormanagement #updating #candidate #corps #recruiting #itrecruiters #itstaffing #contractual #jobs #candidatessearching #hotlist #availableconsultants #vendorlist #requirements #vendorempanelment #h1bvisa #jobsearch #opt #gc #subcontractors #benchsales #daily #matching #hiring #systemintegrators #marketing #jobseeking #lettings #benchmarking #vendor #recruiterjob
-
Hi there, I hope you are all doing well! #C2C #C2Crequirements #usitrecruitment #C2Cposition #urgentrequirements
Please send resumes to chhavi@smartiplace.net

Position Title: Azure DevOps Data Engineer / Azure Synapse Architect
Mode: Onsite
Location: Pinellas Park, FL (must be local)
Visa: USC, GC

Project details: They are ultimately moving their entire database and data solutions from Oracle to Microsoft Azure Synapse (a brief illustrative sketch of this kind of migration follows this posting).

A few additional notes:
The initial job description says this person will combine “technical support” with engineering and designing the data stack. The focus of the role is firmly on engineering and designing, almost architecting, how their data stack will best use Synapse. The “technical support” wording refers to covering any client-specific gaps in functionality that come up during the migration. This person may from time to time interact with third-party clients' technology staff, so they are looking for a strong communicator who will represent them well.
Note that Insurance or Finance/Banking experience is a big plus, but not a deal breaker; someone who has worked with high-volume transactional datasets is acceptable. The Oracle experience may be the one item that makes this candidate pool a bit challenging; as long as a candidate has experience migrating high-volume/complex data to Azure Synapse from any source, they are worth putting in front of the client.
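For context on the work described above, here is a minimal PySpark sketch of copying one high-volume table from Oracle into an Azure Synapse dedicated SQL pool over JDBC. It is only an illustration of the migration pattern the posting describes; the hosts, table names, credentials, and column names are placeholders, not details from the client.

```python
# Minimal sketch (assumption-laden): copy one Oracle table into Azure Synapse with PySpark.
# Hosts, database/table names, credentials, and columns below are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle_to_synapse_copy").getOrCreate()

# Extract: read a high-volume transactional table from Oracle over JDBC.
transactions = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")  # placeholder Oracle endpoint
    .option("dbtable", "CLAIMS.POLICY_TRANSACTIONS")                # placeholder source table
    .option("user", "etl_user")
    .option("password", "********")
    .option("driver", "oracle.jdbc.OracleDriver")
    .option("fetchsize", "10000")                                   # larger fetches for bulk reads
    .load()
)

# Transform: light conformance before landing the data.
cleaned = (
    transactions
    .dropDuplicates(["TRANSACTION_ID"])
    .withColumnRenamed("TXN_TS", "transaction_ts")
)

# Load: write to a Synapse dedicated SQL pool table using the SQL Server JDBC driver.
(
    cleaned.write.format("jdbc")
    .option("url", "jdbc:sqlserver://myworkspace.sql.azuresynapse.net:1433;database=analytics")  # placeholder
    .option("dbtable", "dbo.policy_transactions")
    .option("user", "synapse_loader")
    .option("password", "********")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .mode("append")
    .save()
)
```

In practice a migration like this would be orchestrated per table (for example from Azure Data Factory or Synapse pipelines) with partitioned reads and staged loads, but the shape of each copy step is roughly as above.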
-
Sr. Data Engineer | Actively looking for job opportunities | Data Engineer | Data Migration | AWS Services | Azure Synapse | Azure Databricks | Data Factory | SQL | Python | PySpark | SparkSQL | Azure DevOps | Hadoop
I'm Harshita, and I'm actively seeking a new opportunity as a Data Engineer. With over 8 years of experience in the field, I've made impactful contributions across the Healthcare, Retail, and Insurance domains. Proficient in cloud technologies and well versed in Big Data platforms, I've navigated complex landscapes, optimized data pipelines, and turned data into actionable insights. My expertise spans the entire data lifecycle, from conception to implementation: designing and deploying multi-tiered applications on AWS, configuring Spark jobs for optimal performance, developing real-time customer engagement platforms, and migrating applications from on-premises to Azure using Azure Databricks for efficient data processing.

I'm currently looking for C2C/C2H roles. If you're interested in learning more about my experience and how I can contribute to your team, I'd love to connect and have a conversation. Feel free to reach out to me at +1 480-409-8695 or mharshita356@gmail.com. Thank you for considering me, and I look forward to the possibilities ahead.

NOTE: Open for Day 1 onsite.

#DataEngineeringJobs #BigDataJobs #DataJobs #C2CJobs #C2HJobs #HybridWork #RemoteJobs #OnsiteOpportunities #JobSearchTech #DataAnalyticsRoles #TechJobSeekers #OpenForNewOpportunities #BigDataCareer #JobSearch2024 #cloud #awsdataengineer #azuredataengineer #azurecloud #awscloud #sql #scala #apachespark #hadoop #connectwithme
-
#hotlists Hello LinkedIn, good morning, I hope you are doing great!
Here is the updated hotlist of our #c2cconsultants. Kindly let me know if you have any suitable #c2crequirement; you can reach me at Rakesh.sales0@gmail.com or call +1 (754)-240-5253. Please add my email ID to your database and share requirements on a daily basis.

Here are the technologies:
Cloud Data Engineer - 10
AWS Data Engineer - 10
Big Data Engineer - 10

#hiringrecruiters #itrecruiters #usrecruiters #technicalrecruiter #talentacquisition #usitrecruitment #shares #recruiters #staffingsolutions #USITTalentacquisition #staffingandrecruiting #ITstaffing #c2c #usitrecruiters #c2chiring #recruitmentleaders #corptocorp #corp2corp #jobsearching #recruiting #linkedinjobs #networking #lookingforjob #searchstring #recruitment #jobseeking #recruitmentcareers #hotlist #pythondeveloper #javadeveloper #AWScertifiedDeveloper #C2CJobs #ITJobs #USJobs #Recruitment #TechJobs #ContractJobs #Hiring #ITRecruitment #USRecruiter #JobOpening #ITConsulting #ConsultingJobs #ContractorJobs #TechnologyJobs #Staffing #JobOpportunity #NowHiring #ITStaffing #ContractWork #RecruitingLife #C2C #Profiles #c2ccandidates #opentowork #openforjobs #C2Crequirements #resumes #benchsales #c2cvendors #stateclient #JavaDeveloperJobs #SoftwareEngineering #JavaProgramming #FullStackDeveloper #BackendDeveloper #FrontendDeveloper #SoftwareDeveloper #JavaDevelopment #ProgrammingJobs #LinkedInJobs #JobSearch #JavaDeveloper #SoftwareDevelopment #TechCareer #JavaCoding #SoftwareEngineeringJobs #JavaJobs #JobPosting #JavaProgrammer #DeveloperOpening #LinkedInCareer #TechTalent #TechJobListing #JavaDeveloperPosition #JavaJobListing #TechHiring #JavaCodingJobs #JobOpenings #SoftwareDeveloperJobs #JavaProgrammingJobs #JavaDevelopmentJobs #SoftwareJobs #LinkedInOpportunity #JobSeeking #JobListing #JavaDeveloperRole #SoftwareDevelopmentJobs #FrontendDevelopment #BackendDevelopment #FullStackDevelopment #HiringNow #DeveloperJobs #C2CPositions #Corp2Corp
-
Greetings, LinkedIn community! I hope this message finds you thriving and in good spirits.

I'm Aswith, a seasoned Data Engineer with over a decade of experience in the field. My journey in data engineering has involved a deep dive into Big Data technologies, navigating complex challenges with AWS, Azure, and GCP at the forefront. My forte lies in architecting robust data pipelines and orchestrating seamless data migrations to cloud environments, with significant contributions across the healthcare, insurance, and banking industries.

I'm currently looking for new Data Engineering opportunities on a C2C/C2H basis. If your organization needs a dynamic professional to drive data initiatives forward, or if you simply have valuable insights to share, I'd love to connect and explore synergies. Feel free to reach out to me at paswith797@gmail.com or 8722402068.

Your time and consideration are greatly appreciated, and I look forward to engaging with this vibrant professional community.

Warm regards,
Aswith

#opentowork #opportunities #career #hiring #recruitment #AWS #azurecloud #dataengineer #dataanalyst #bigdataengineer #cloudengineer #python #sqlserver #jobsusa #c2crequirement #jobalert #usajobs #contractjobs #itrecruiters #recruiters #technicalrecruiter #c2c #c2cOppurtunities #c2cjobs #c2crequirements #corptocorp #corp2corp #corp2hire #corptohire #c2h #oppurtunities #c2cvendor #c2cusajobs #c2cconsultant #c2croles #ITRecruiter #TechRecruiter #Hiring #TechJobs #JobOpportunities #ITJobs #Recruitment #TechTalent #ITIndustry #JobSeekers #TechCareer #ITNetworking #JobSearch #TechHiring #Recruiting #DataEngineer #BigData #ETL #DataProcessing #DataWarehousing #DataPipeline #ApacheSpark #Hadoop #Python #Scala #SQL #DataIntegration #DataModeling #DataArchitecture #StreamingAnalytics #BatchProcessing #DataWarehouse #ETLJobs #DataEngineeringJobs
-
#hiring #w2 #remote
Microsoft Fabric ETL Developer - Fully Remote
Email: ajay@foxprotech.com (or) DM Ajay

Skills:
High proficiency in Microsoft Fabric and related ETL tools (e.g., Azure Data Factory)
Knowledge of database systems (e.g., SQL Server, Azure SQL Database, Synapse Analytics) and understanding of data warehousing concepts and architecture
Experience with data modeling and schema design
Familiarity with programming languages used in ETL processes (e.g., Python, PySpark); a brief illustrative sketch follows this posting
Strong understanding of data engineering principles, including data modeling, data transformation, and data optimization
Strong SQL skills for data extraction, transformation, and querying
Knowledge of accounting principles and logic is highly beneficial

#hiringalert #hiringnow #w2 #w2jobs #MicrosoftFabric #ETL #etldeveloper #azure #azuresynapse #dataengineering #sql #remote #remotejobs #remotehiring #remotework #fullyremote
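As a rough illustration of the Python/PySpark ETL familiarity the posting asks for, here is a hedged sketch of a single extract-transform-load step. The storage paths, column names, and the simple debit/credit accounting logic are assumptions made for the example, not requirements from the ad.

```python
# Illustrative ETL step only; paths, schema, and business rules are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fabric_etl_sketch").getOrCreate()

# Extract: raw ledger entries previously landed as Delta files (placeholder path).
ledger = spark.read.format("delta").load("/lakehouse/raw/ledger_entries")

# Transform: cast amounts, split debit/credit sides, and aggregate per account and day.
balances = (
    ledger
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("debit",  F.when(F.col("entry_type") == "D", F.col("amount")).otherwise(F.lit(0)))
    .withColumn("credit", F.when(F.col("entry_type") == "C", F.col("amount")).otherwise(F.lit(0)))
    .groupBy("account_id", "posting_date")
    .agg(F.sum("debit").alias("total_debit"), F.sum("credit").alias("total_credit"))
)

# Load: write the curated result back as a Delta table for downstream reporting.
balances.write.format("delta").mode("overwrite").save("/lakehouse/curated/account_balances")
```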
-
Greetings, LinkedIn community! I hope this message finds you thriving and in good spirits.

I'm Disha, a seasoned Data Engineer with over a decade of experience in the field. My journey in data engineering has involved a deep dive into Big Data technologies, navigating complex challenges with AWS, Azure, and GCP at the forefront. My forte lies in architecting robust data pipelines and orchestrating seamless data migrations to cloud environments, with significant contributions across the healthcare, insurance, and banking industries.

I'm currently looking for new Data Engineering opportunities on a C2C/C2H basis. If your organization needs a dynamic professional to drive data initiatives forward, or if you simply have valuable insights to share, I'd love to connect and explore synergies. Feel free to reach out to me at dishab423@gmail.com or 5518886929.

Your time and consideration are greatly appreciated, and I look forward to engaging with this vibrant professional community.

Warm regards,
Disha

#opentowork #opportunities #career #hiring #recruitment #AWS #azurecloud #dataengineer #dataanalyst #bigdataengineer #cloudengineer #python #sqlserver #jobsusa #c2crequirement #jobalert #usajobs #contractjobs #itrecruiters #recruiters #technicalrecruiter #c2c #c2cOppurtunities #c2cjobs #c2crequirements #corptocorp #corp2corp #corp2hire #corptohire #c2h #oppurtunities #c2cvendor #c2cusajobs #c2cconsultant #c2croles #ITRecruiter #TechRecruiter #Hiring #TechJobs #JobOpportunities #ITJobs #Recruitment #TechTalent #ITIndustry #JobSeekers #TechCareer #ITNetworking #JobSearch #TechHiring #Recruiting #DataEngineer #BigData #ETL #DataProcessing #DataWarehousing #DataPipeline #ApacheSpark #Hadoop #Python #Scala #SQL #DataIntegration #DataModeling #DataArchitecture #StreamingAnalytics #BatchProcessing #DataWarehouse #ETLJobs #DataEngineeringJobs
-
#hiring #w2 #remote
Microsoft Fabric ETL Developer - Fully Remote
Email: aishwarya@foxprotech.com

Skills:
High proficiency in Microsoft Fabric and related ETL tools (e.g., Azure Data Factory)
Knowledge of database systems (e.g., SQL Server, Azure SQL Database, Synapse Analytics) and understanding of data warehousing concepts and architecture
Experience with data modeling and schema design
Familiarity with programming languages used in ETL processes (e.g., Python, PySpark)
Strong understanding of data engineering principles, including data modeling, data transformation, and data optimization
Strong SQL skills for data extraction, transformation, and querying
Knowledge of accounting principles and logic is highly beneficial

#hiringalert #hiringnow #w2 #w2jobs #MicrosoftFabric #ETL #etldeveloper #azure #azuresynapse #dataengineering #sql #remote #remotejobs #remotehiring #remotework #fullyremote
-
#Immediate Openings: Azure Cosmos DB and Azure Data Factory (ADF) Specialist @ Remote (USA) (any visa except CPT/OPT is fine)

Hello folks, I hope you are doing well. Please find the job description below and reply with a suitable resume.

Job Title: Azure Cosmos DB and Azure Data Factory (ADF) Specialist
Location: Remote
Duration: Long-term contract
Visa: Any visa except CPT/OPT

We need an Azure Cosmos DB and Azure Data Factory (ADF) Specialist with at least 10-15 years of IT experience, including 6+ years of Cosmos DB (MongoDB API) experience.

SUMMARY:
Design and implement data models and data distribution
Integrate, optimize, and maintain an Azure Cosmos DB solution
Create appropriate indexing policies
Interpret JSON documents
Design, develop, and maintain data integration solutions using Azure Data Factory
Develop and optimize data extraction, transformation, and loading (ETL) processes involving Oracle databases
Create robust and scalable data pipelines to support business needs

Mandatory skills:
1) Deep understanding of Azure Cosmos DB with MongoDB API features, architecture, and best practices, including CRUD operations, indexing, the aggregation framework, and data modelling.
2) Data modelling: ability to design Azure Cosmos DB with MongoDB API document schemas that reflect the application's data requirements and access patterns.
3) Database management: experience with administration tasks such as installation, configuration, backup and recovery, performance tuning, and security management.
6) Monitoring and troubleshooting: proficiency in monitoring Azure Cosmos DB with MongoDB API performance metrics, diagnosing issues, and resolving them.
7) Scripting and automation: skills to automate routine tasks and manage Azure Cosmos DB with MongoDB API deployments.
8) Familiarity with the command-line interface for server management and troubleshooting.
9) Cloud platforms: understanding of cloud platforms (such as Azure) and experience deploying Azure Cosmos DB with MongoDB API on them.
11) Backup and recovery: knowledge of backup strategies and tools for MongoDB, and the ability to recover data in case of failure.
12) High availability and scalability: experience configuring Azure Cosmos DB with MongoDB API for high availability and scalability using features like replica sets and sharding.
13) Knowledge of data transformation from an Oracle SQL database to Cosmos DB (MongoDB) documents (a brief illustrative sketch follows this posting).

Secondary skill: hands-on experience in Azure administration.

Please share resumes with jiten@sharpitco.com
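To make the Oracle-row-to-Cosmos-document requirement concrete, here is a small hedged sketch using pymongo against Azure Cosmos DB's API for MongoDB. The connection string, database and collection names, document schema, and index choices are all assumptions made for illustration; none of them come from the client's environment.

```python
# Hedged illustration of reshaping a flat Oracle row into a Cosmos DB (MongoDB API) document.
# Connection string, database/collection names, fields, and indexes are placeholders.
import datetime
from pymongo import MongoClient, ASCENDING

# Azure Cosmos DB for MongoDB exposes a standard MongoDB connection string.
client = MongoClient("mongodb://<account>:<key>@<account>.mongo.cosmos.azure.com:10255/?ssl=true")
policies = client["insurance"]["policies"]

# Indexing policy: the MongoDB API honours standard createIndex calls.
policies.create_index([("policyNumber", ASCENDING)], unique=True)
policies.create_index([("holder.lastName", ASCENDING), ("effectiveDate", ASCENDING)])

def oracle_row_to_document(row: dict) -> dict:
    """Reshape a flat Oracle row (already fetched, e.g. via python-oracledb) into a nested JSON document."""
    return {
        "policyNumber": row["POLICY_NO"],
        "effectiveDate": row["EFF_DATE"].isoformat(),
        "holder": {"firstName": row["HOLDER_FIRST"], "lastName": row["HOLDER_LAST"]},
        "premium": {"amount": float(row["PREMIUM_AMT"]), "currency": row["CURRENCY_CD"]},
    }

# Example load of a single transformed row.
sample_row = {
    "POLICY_NO": "P-10001",
    "EFF_DATE": datetime.date(2024, 1, 1),
    "HOLDER_FIRST": "Ana",
    "HOLDER_LAST": "Lopez",
    "PREMIUM_AMT": 1250.00,
    "CURRENCY_CD": "USD",
}
policies.insert_one(oracle_row_to_document(sample_row))
```

Nesting the holder and premium details inside one document trades the relational joins of the Oracle schema for access-pattern-driven documents, which is the data-modelling skill the posting emphasises.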