Virtual Hiring Event, Round 2! Pantheon Data will be hosting a Virtual Hiring Event to find talented individuals for a variety of positions next week! 🚀

📅 Event Date: Tuesday, August 6th (9am-5pm ET)
📍 Location: Online (from the comfort of your home!)

Ready to take your career to the next level? Join us and become part of a team that's driving innovation and excellence in the tech industry!

🌐 We're hiring for multiple roles, including 👇
AWS Test Automation Engineer - https://lnkd.in/gYfdNege
Salesforce Developer - https://lnkd.in/gQVYDijm
Senior BI Developer - https://lnkd.in/gMGRe7Ve
Senior DevSecOps Engineer - https://lnkd.in/gRUdhV_q
Data Integration Developer - https://lnkd.in/g5HNcMZH
Data Integration Engineer - https://lnkd.in/gibPp4PB

Connect with our talent team to keep a pulse on additional job opportunities: Siesar Reeves-James, Christopher Santiago, Cheri Hostetler, PHR, SHRM-CP, Bridget Mundy, David Cranfield, Rachel Frumkin, Sam Diab

#RemoteITJobs #NowHiring #AWS #BIJobs #AWSRemoteJobs #WFH #DataIntegration #TechRecruiter #DeveloperJobs #Developer #PantheonData
Pantheon Data’s Post
#Still #hiring for #DataEngineer, #C2C, #local to CA, 12+ years

Location: Sunnyvale, CA (local candidates only), on C2C
Duration: 12+ months (possible extension)
Experience: 12+ years

Job Description:

What you'll do:
• Use cutting-edge data engineering techniques to create critical datasets and dig into our mammoth scale of data to help unleash the power of data science by imagining, developing, and maintaining data pipelines that our Data Scientists and Analysts can rely on.
• Contribute to an orchestration layer of complex data transformations, refining raw data from source into targeted, valuable data assets for consumption in a governed way.
• Partner with Data Scientists, Analysts, other engineers, and business stakeholders to solve complex and exciting challenges so that we can build out capabilities that evolve the marketplace business model while making a positive impact on our customers' and sellers' lives.
• Design, develop, and maintain highly scalable and fault-tolerant real-time, near-real-time, and batch data systems/pipelines that process, store, and serve large volumes of data with optimal performance.
• Build business domain knowledge to support the data needs of product teams, analytics, data scientists, and other data consumers.

What you'll bring:
• At least 4 years of experience developing big data technologies/data pipelines.
• Experience with big data technologies such as Hadoop, Apache Spark (Scala preferred), Apache Hive, or similar frameworks on the cloud (GCP preferred; AWS, Azure, etc.) to build batch data pipelines, with a strong focus on optimization, SLA adherence, and fault tolerance.
• Experience writing SQL to analyze, optimize, and profile data, preferably in BigQuery or Spark SQL.
• Strong data modeling skills for designing a schema that can accommodate the evolution of data sources and facilitate seamless joins across various datasets.
• Strong analytical and problem-solving skills for identifying and resolving issues that may arise during data integration and schema evolution.

Nice to have:
• Experience building complex near-real-time (NRT) streaming data pipelines using Apache Kafka, Spark Streaming, or Kafka Connect, with a strong focus on stability, scalability, and SLA adherence.
• Good understanding of REST APIs; working knowledge of Apache Druid, Redis, Elasticsearch, GraphQL, or similar technologies. Understanding of API contracts, building telemetry, stress testing, etc.
• Exposure to developing reports/dashboards using Looker/Tableau. Experience in the eCommerce domain preferred.

Email: rashmi@linktms.com
Hi, hope all is well with you. Please find the urgent requirement I'm working on today and share suitable resumes to sandhya.ej@incorporaninc.com

Job Role: Analytics L3 Support Lead
Location: Boston, MA (onsite)
Position: C2H

Mandatory skills: Azure Data Factory, Data Lake, Databricks, CDP, Qlik ETL integration, and Google BigQuery
Must-have skills: Azure Data Factory, Google BigQuery, Redpoint Data Management

JOB DESCRIPTION
Collaborates with the Architect on product development by analyzing requirements, development, testing, and final delivery of the product. Significant experience in Azure technologies, i.e., Azure ADF, IoT, Event Hub, Cosmos DB, SQL DB, Snowflake, etc. Expert in Azure big data architecture and the Azure ecosystem. Leads small teams to deliver advanced data modeling and optimization at scale. Proficient in managing data from multiple sources. Adept at exploiting technologies to manage and manipulate data, scaling data models and solutions to support analytics of business insights. Can write high-performance, reliable, and maintainable code with Azure technologies. Proficient in setting up and working with huge big data clusters, cluster monitoring, and maintenance. Coordinates with the Data Architect and Project Manager on project status. Proficient in working with CDP (Customer Data Platform), match & merge, hierarchy maintenance, etc. Knowledge of DevOps integration frameworks and designing pipelines integrated with Azure DevOps. Understands repeated issues in the Azure platform (data and analytics pipelines) and provides automation to fix those issues permanently.

#AzureDataFactory #GoogleBigQuery #RedpointDataManagement #AzureADF #IoT #EventHub #CosmosDB #SQLDB #Snowflake #DevOps #AzureDevOps
Hello professionals, good evening! Hope you are doing great. This is Karthick from Sight Spectrum. We have an immediate opening for our Fortune clients, with immediate interview slots available.

Title: #DataEngineer with strong #AWS #QuickSight
Location: #Raleigh, NC (Day 1 onsite)
Contract: Long term
Experience: 12+ years (must)

Client expectation: "I need more of an analytics person vs. #ETL / data engineer, with more experience presenting insights on #Dashboards and apps."

Key skill set:
• Worked on #DataScience projects with sound analytical knowledge of providing business insights
• Able to perform data wrangling and transformation using #Python, and self-sufficient in fixing any bugs that arise during data transformation
• Experienced in creating Plotly dashboards, with interactive application development knowledge including write-back
• Hands-on expertise in #AWS and #Databricks; has worked on #DataLake and #DataWarehouse projects
• Sound knowledge of creating and enhancing dashboards and reports using #QuickSight

Please share resumes at karthickraja@sightspectrum.com

#immediatejoiners #immediatehiring #immediate #c2crequirements #c2c #c2cvendors #c2cjobs #c2chotlist #c2cusajobs #c2crequirement #c2cavailable #c2crecruiters #c2chiring
Hello #LinkedIn, I'm #hiring. Let me know if you are interested.

#JobTitle: #TechnicalDataAnalyst
#Location: #MountainView, #CA (#Onsite, #Backfill)
#Required: 9 years

• Quantitative data analysis is preferred
• Advanced #SQL skills to get the data you need from a #DataWarehouse and perform #DataSegmentation and aggregation from scratch
• Strong with #TDA and #SQL coding
• #DataQuery and data processing tools/systems (e.g., #relational, #NoSQL, #streamprocessing)
• Familiarity with #AWS (#Redshift, #Athena, and #AWS core concepts)
• Familiarity with #DataModeling
• Familiarity with analytical and data modeling tools, such as #Python, #R, or #PyCharm
• Inquisitive, curious, and biased toward learning and continuous improvement
• Ability to grasp quickly and get up to speed on instrumentation, #clickstreamdata, #DataArchitecture, and complex business system interactions such as #Salesforce, #AmazonConnect, #EWFM (preferred)

Reach out to me at chanikya.k@fionasolutions.com

Dhee Raj, Daniel D, Sai Tejas, K Hameeda, Jessica I, Kunal Goud, G Mohith Malepati, Fiona Solutions

#LinkedIn #hiring #benchsales #c2c #w2
#EU #Contract #roles available! These are B2B opportunities for individuals (no third parties accepted) who have the legal right to work within the EU.

1) 2x Senior ETL #Engineers - 6 months + extensions. Remote, with at most 2 trips to Bulgaria required. Needed skills: Python/PySpark & Databricks

2) 1x #Azure Architect & 2x Senior Azure Engineers - 6 + 12 months extension. Remote, with a 1-day trip to #Sofia; the rest is done remotely. Wanted skills: Azure Synapse, Data Factory, SQL, #Python

3) 1x Lead #Architect - URGENT, needs to be able to start ASAP. 12-month contract + possible extension. #Remote #work, to lead a large cloud transformation (on-prem to cloud). Must-have skills: the Azure architecture ecosystem (other cloud providers are welcome); leading other architects and architectural decision-making; governance, road map + strategy

4) Senior #Data Engineer - 4-month initial contract + 6-month extension. 90% remote & 10% on-site (Bulgaria). Wanted skills: Data Modeling, Spark, Azure Synapse

All of the roles are at 100% workload, within the CET time zone. If you see something that piques your interest, contact me!

#contractingjobs #freelance #opportunities
Senior Data Engineer / Data Scientist at Disney Streaming | Actively Seeking New Opportunities | Data Engineer | Big Data | Azure | AWS | SQL | ETL | LLM | GenAI | Power BI | Tableau | PySpark | Snowflake
🔍 Still Searching for Exciting Data Opportunities! 📈

Hello LinkedIn community, I hope this message finds you well. I am actively seeking new career opportunities in data analysis and data engineering. My expertise in cloud technologies (AWS, GCP) and ETL processes enables me to deliver impactful solutions that enhance stakeholder satisfaction and project efficiency.

In my previous roles, I've contributed to data integration, accuracy, and visualization efforts, supporting informed decision-making within organizations. I am passionate about staying current with emerging data analytics trends through continuous learning, which I believe is crucial for driving organizational success in today's data-driven world.

As my current project nears completion, I am proactively exploring new opportunities. I am particularly interested in Corp-to-Corp (C2C) or Contract-to-Hire (C2H) positions that offer both flexibility and the potential for long-term growth. If you are hiring for a Senior Data Engineer and find my skill set aligns with your client's needs, I would greatly appreciate hearing about relevant openings. I am confident in my ability to deliver exceptional results and add value to your organization.

Feel free to connect with me on LinkedIn or email me at meghanac279@gmail.com. Thank you for your time and consideration. I look forward to hearing from you soon!

Best regards,
Meghana
7206639006

#c2cjobs #c2c #c2h #hiring #datajobs #bigdata #recruiters #cloud #kafka #hadoop #aws #spark #hive #etl #s3 #snowflake #linux #python #sql #scala #azure #databricks #ssis #data #pipeline #activelylooking #dataengineering #jobhunt #newjob #jobinusa #connections #layoffs #opportunity #askmehow #engineering #help #share #c2chiring #c2crequirements #hiringalerts #primevendors
Hello #jobseekers: #AWS/#DevOps, #Salesforce, #DataEngineer, and #Selenium online training & placement programme with MNC companies. [Training fee: pay after 1 month] & [Placement fee: pay after getting the job] [Bank loan facility available for the total fee, from an NBFC] Get trained... get interviews... get placed!

#onlinetraining #onlinecourses #skillsdevelopment #careerdevelopment #jobplacement #careeropportunities #futureofwork #remotelearning #eLearning #awsdevops #Annexit #dataviz #webdevelopment #devops #programminglife #html #cybersecurity #coder #dataanalyst #code #datamining #engineering #linux #cloudcomputing #softwaredeveloper #sql #automation #science #cloudz #softwareengineer #neuralnetworks #datasciencetraining #daysofcode #bigdataanalytics
Exciting news! I have been selected for a fellowship with Hiring Our Heroes! Now you might be asking, "OK, how does that help me?" I am all things data! I am seeking roles in data analysis and data engineering, but that's not all! I am a Certified Salesforce Administrator, which means I can be not only your Data Analyst but also your Salesforce guru. And it doesn't end there: I have my Microsoft Azure Fundamentals certification, so I can apply my skills to the infamous "cloud."

Currently, I am only a candidate 🤔, but I can be your fellow! To become a fellow, you can bring me in for an 11-week program, which is PAID for by the fellowship. Think of it as "try before you buy," which we all know we don't get nowadays. I'm located in Washington, D.C., and prefer fully remote or hybrid positions.

I would like to express my gratitude to Hiring Our Heroes for this opportunity! Please like, comment, or share this post, and check out my profile! Let me know if you know someone or a company that may be interested!

Valerie C., Yessenia Hagewood

#militaryspouse #hiringourheroes #data #analytics #dataanalysis #dataanalyst #datavisualization
Are you in need of top-notch consultants for Corp-to-Corp jobs? Our team of excellent consultants is ready to meet your staffing needs! 🌟 Check out the available list of candidates in the image attached. For more information, reach out to 614-810-0182 or add my email to your distribution list at martin@narveetech.com.

Regards,
Martin
Bench Sales
Narvee Tech Inc
17440 Dallas Pkwy, Ste# 122, Dallas, TX 75287
Email: martin@narveetech.com
Phone: 614-810-0182
www.narveetech.com (E-Verify)
LinkedIn: https://lnkd.in/gisRXryV

#LeadSoftwareEngineer #H1B #SoftwareEngineering #TechJobs #OnsiteJobs #RemoteJobs #JavaDeveloper #BackEndDeveloper #SeniorDeveloper #NETDeveloper #DotNet #DataEngineer #BigData #AWS #GCP #Azure #Hadoop #DataArchitect #PowerApps #Dynamics365 #SharePoint #PowerBI #ImplementationManager #BusinessAnalyst #ResourceDevelopment #ProgrammerAnalyst #DataAnalyst #ClinicalDataManager #CloudEngineer #Cloud #DevOps #Snowflake #DataWarehouse #Salesforce #CRM #JavaFullStack #OPT #Workday #TechnoFunctional #Payroll #Prism #IOSDeveloper #SDET #QA #Automation #PythonDeveloper