Sharing the updated list of #CONSULTANTS who have worked on past projects and are now actively looking for new ones. Please reach out to any of the contacts below with your requirements:
Dilip - dilip@nam-it.com - Phone +1 732-743-8373
Lisha - lisha@nam-it.com - Phone +1 732-993-7485
John - john@nam-it.com - Phone +1 732-993-5322
#c2crequirements #usrecruitment #jobsearch #javajobs #hadoop #bigdata #automationengineer #uideveloper #datanalytics #cloudcomputing #saphybris #saphana #oracleebs #oraclefinancials #dotnetdevelopers #awssolutionarchitect #saphcm #sharepointdeveloper #informatica #dataengineer #javafullstack #angularjobs #scrummaster #frontenddeveloper #qamanager #recruitment #sapbi #datasciencecareers #devopsjobs #devops #hotlist #benchlist
Vinay Mahajan B.J. Venkatesh Abhimanyu Diwaker Shyam Valloornatt Balaji Krishnamoorthy Srikanth M Srini V S Venki R Lisha Akshay Smital Dhavane Divya Shetty Pooja K Praveen R Gowda B J Lokesh Thirupathi S Jawaharlal Nehru G Arvind Sharma K Manjunath MS Chandra M Shilpa Pradeep Abishai c Aravinth C Benjamin Samuel ITServe Alliance Punjabi Chamber of Commerce
-
Hello #everyone, Greetings! I'm #hiring for our client's #requirement. Get in touch if you are interested in this project in #Madrid, #Spain.
#Language: #Spanish and #English. #Contract duration: 12+ #months, #hybrid, based in #Madrid, Spain.
#Role 1: #Data Architect
Experience in designing, defining, and #architecting large data migrations.
A comprehensive understanding of data warehousing and data transformation (extract, transform, and load) processes and supporting technologies such as #AWS Glue, #EMR, #Azure #Data #Factory (#adf), Data Lake, and other analytics products.
Proven experience in architecting and #implementing Business Intelligence and data warehouse platforms, Master Data Management, data integration, and OLTP database solutions.
Designing architecture solutions in line with long-term business objectives.
Excellent problem-solving and data modelling skills.
#Role 2: #Informatica #Developers
Informatica developers to analyse a #business's database storage and #warehousing capabilities and assess the company's data requirements.
In Informatica, a mapping is a collection of source and target objects linked together by a set of transformations (ETL).
5-7+ years of hands-on experience designing and developing Informatica applications.
Good hands-on experience with these Informatica concepts: performance tuning, data modelling and data #warehousing concepts, reusable transformations, partition techniques.
Strong database knowledge: #Oracle PL/SQL and tuning.
Expert in writing Oracle procedures, functions, and packages and in debugging code.
Provide technical support for issues, including hands-on #troubleshooting.
Informatica: #mappings, #workflows, sessions, reusable components, parameters and variables, #performance tuning.
#Role 3: #DataMigration Specialist (#functional understanding)
Migration strategy, IFF specifications, data migration scope, workstream plan, data migration #mappings.
Load data into the deployed solution using tools built on #SQL Server Integration Services.
Excellent skill in understanding #data models and objects and mapping them to the target #schema.
Excellent #ETL skills, with knowledge of #Informatica and other tools such as Talend.
Converting database objects, applying code to the #PostgreSQL schema, creating DMS users, and modifying DMS tasks. #Migrating data and fixing defects (a DMS sketch follows below). #AWS DMS, SQL, Oracle DB, PostgreSQL DB.
Informatica: mappings, workflows, sessions, reusable components, parameters and #variables, performance #tuning.
#Share your updated CV with me at sonu.kumar@staffinprime.com
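For context on the Role 3 toolchain, here is a minimal, hedged sketch of driving an AWS DMS replication task from Python with boto3. The region, account, ARN, and task name are placeholders, and the replication task, endpoints, and converted PostgreSQL schema are assumed to exist already; this is an illustration, not part of the posting.

```python
# Restart a DMS migration task and check its status (placeholders throughout).
import boto3

dms = boto3.client("dms", region_name="eu-west-1")
TASK_ARN = "arn:aws:dms:eu-west-1:123456789012:task:EXAMPLETASK"  # placeholder

# Reload the target (Oracle -> PostgreSQL in this project's setup).
dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="reload-target",
)

# Check where the task stands after kicking it off.
task = dms.describe_replication_tasks(
    Filters=[{"Name": "replication-task-arn", "Values": [TASK_ARN]}]
)["ReplicationTasks"][0]
print("DMS task status:", task["Status"])
```

DMS handles the bulk copy; converting database objects and fixing defects still happen in the PostgreSQL schema itself.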
-
SQL Transformation in Informatica Cloud is used to call a stored procedure or function, or to execute SQL queries midstream in a mapping pipeline. It can be used as either a connected or an unconnected transformation.
🌟 Connected SQL Transformation: A connected SQL transformation is an inline transformation that sits in the flow of the mapping. It can be used to:
✅ Run stored procedures as data flows through the transformation.
✅ Pass parameters to the stored procedure and receive single or multiple output parameters.
🌟 Unconnected SQL Transformation: An unconnected SQL transformation is not connected to any transformation in a mapping and is called using the :SP expression. An unconnected SQL transformation provides additional capabilities:
✅ Execute stored procedures before or after a mapping.
✅ Run a stored procedure once during a mapping.
✅ Conditionally run stored procedures using IIF statements.
✅ Call stored procedures multiple times within a single mapping.
📝 Check out the complete article on SQL Transformation: https://lnkd.in/g6RVWG-P
🔴 Here is the link to the YouTube video on SQL Transformation: https://lnkd.in/giaZekjE
#ThinkETL #Informatica #InformaticaCloud #SQLTransformation #DataIntegration #StoredProcedures #ETL #IICS
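To make the connected flow concrete outside Informatica, here is a minimal Python sketch of what a connected SQL transformation effectively does midstream: call a stored procedure once per input row and capture its OUT parameter. The procedure SP_GET_REGION and the connection details are hypothetical.

```python
# Per-row stored procedure call, as a connected SQL transformation would do.
import cx_Oracle

conn = cx_Oracle.connect("etl_user", "etl_pwd", "dbhost/ORCLPDB1")
cur = conn.cursor()

incoming = [(101,), (102,), (103,)]      # stand-in for rows flowing through the mapping
for (cust_id,) in incoming:
    region = cur.var(cx_Oracle.STRING)   # bind variable to receive the OUT parameter
    cur.callproc("SP_GET_REGION", [cust_id, region])
    print(cust_id, region.getvalue())    # output fields continue down the pipeline

conn.close()
```

The unconnected variant covers the remaining cases listed above: firing once, before or after the mapping, or conditionally from an IIF expression via :SP.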
-
Job Description: "Minimum 4 years of experience on Oracle's Cloud-based analytics platforms including OAC/ADW/ODI and/or FAW. Strong hands-on expertise in OAC including Analytics, Data Visualization, and Semantic Model Development. Very good development experience in OAC-Reports and dashboards using measures, Filters, calculated measures, calculated items etc Must be able to do Report testing process Experience migrating from OBIEE to OAC. Experience migrating between OAC Instances. Very Good Understanding of DatawareHousing Concepts and Data Warehouse modeling. Thorough handson experience on SQL(on any RDBMS Source). Able to troubleshoot report errors and issues on OAC. Hands On knowledge on Building, Analysis and visualizations based on Datasets created using SQL or Excel Data Sources. Good Knowledge on RPD Modeling and Usage of Data modelers on OAC. Able to troubleshoot report errors and issues on OBIEE/OAC and understand the tool limitations for OAC. Should have experience in performance tuning OAC Analysis, this includes analyzing the Explain Plan of the query, tuning the data model as well as making modifications to the tables such as indexing. Should have good knowledge of Coding, Debugging and Design and Documentation. Understanding of the flow of data between ERP and Data Warehouse. Preferable to Model and Build BI Publisher Reports. Any knowledge on PLSQL/ODI/Any ETL Tool would be preferable. Working on Multidimensional sources (like Essbase) is a plus. Any work on OTBI will be a plus. Expertise on the Oracle Analytics Cloud Tool Knowledge on BIApps concepts is preferable. Familiar with Upgrade activities and Issues encountered during Upgrade from OBIEE to OAC. Expertise in SQl/Knowledge of any ETL tool is preferable. Knowledge on FAW (ERP and SCM)/ADW/OAC (Classic, Data Visualization, Semantic Model Development)/ODI is plus. Having knowledge on IOT,Blockchain is preferable. Use feedback and reflection to develop self-awareness, personal strengths and address development areas. Proven track record as an SME in chosen domain. Ability to come up with Client POC/POV for integrating/increasing adoption of emerging Tech. like BlockChain, AI et al with the product platform they are associated with. Mentor Junior resources within the team, conduct KSS and lessons learnt. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket Quality and deliverables review. Status Reporting for the project. Escalation/Risk management. Adherence to SLAs, experience in incident management, change management and problem management. Review your work and that of others for quality, accuracy and relevance. Additional Information: Mandatory Skills -FAW with BI Experience Nice to have skills -ODI How to Apply: Interested candidates are invited to submit their resume and cover letter to uma@vividtechnosolutions.com.
-
We are #Hiring
Title: ETL Developer
Location: Beloit, WI (Remote)
Primary Skills: ETL
Rate: $60/hr on C2C
Client: Randstad / ABC Supply
Agility ETL Developer Job Description
Position: ETL Developer (Data Conversions, SSIS, SQL Server, Python, ERP Expertise)
We're seeking a dynamic and detail-oriented ETL Developer with hands-on experience in large-scale data conversion projects. This role requires a proactive professional who has led or contributed to two or more recent data migration initiatives, particularly involving Db2 and IBM mainframe systems transitioning to MS SQL Server. The ideal candidate will also have strong ERP expertise, with a comprehensive understanding of ERP data models and concepts, and the ability to efficiently migrate data into a custom ERP system.
Requirements:
Data Conversion Expertise: Must have led or been significantly involved in data conversion projects in the last two roles, specifically migrating data from Teradata and IBM mainframe systems into SQL Server.
SSIS & SQL Server Mastery: Advanced experience developing and optimizing SSIS packages, alongside proficiency in MS SQL Server with a focus on complex SQL query writing and database performance tuning. Minimum 8-10 years of SSIS, ETL, and SQL development. Knowledge of MS SQL Server 2012 or later. Experience with SSRS, SSIS, and T-SQL; able to develop SSIS packages.
Python Proficiency: Strong Python skills to automate ETL processes and handle intricate data transformations (see the sketch after this post).
Data Mapping & Modeling: Demonstrated experience conducting detailed data mapping and creating precise data models for migration. Strong data analysis and data migration script creation experience.
Data Flow Diagrams: Ability to craft clear and effective data flow diagrams to illustrate migration and transformation processes.
Technical Documentation: Proven track record of writing clear, comprehensive technical design documents that communicate complex technical solutions to both technical and non-technical audiences.
ERP Data Expertise: Extensive experience working with ERP data models, with a deep understanding of core ERP entities like customers, suppliers, items, and transactions, and the ability to load data into custom ERP systems.
Collaboration & Communication: Exceptional communication skills with the ability to work directly with business users, architects, and cross-functional teams, ensuring alignment and understanding of data solutions.
Additional Skills (Good to Have):
NiFi: Knowledge of NiFi for building efficient data flows and processing pipelines.
Apache Airflow: Familiarity with Apache Airflow for orchestrating complex ETL workflows and managing data pipelines.
Splunk: Experience with Splunk for monitoring, searching, and analyzing large volumes of data.
Apache Spark: Familiarity with Apache Spark for large-scale data processing and distributed computing.
Share your resumes at jay@cloudinventions.us
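As an illustration of the Python-automation requirement above, here is a minimal, hypothetical sketch: a data-mapping spec drives a bulk load of a legacy extract into SQL Server via pyodbc. The file, table, column, and server names are invented for the example.

```python
# Mapping-driven bulk load of a legacy extract into SQL Server.
import csv
import pyodbc

# Legacy-to-ERP column mapping, as it would appear in a data-mapping spec.
MAPPING = {"CUST_NO": "customer_id", "CUST_NM": "customer_name"}

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost;DATABASE=erp;Trusted_Connection=yes"
)
cur = conn.cursor()
cur.fast_executemany = True  # batch the inserts instead of sending row by row

src_cols = list(MAPPING)                # legacy column names
tgt_cols = ", ".join(MAPPING.values())  # target ERP column names
params = ", ".join("?" for _ in MAPPING)

with open("legacy_customers.csv", newline="") as f:
    # Basic transformation step: trim whitespace from every mapped field.
    rows = [tuple(r[c].strip() for c in src_cols) for r in csv.DictReader(f)]

cur.executemany(f"INSERT INTO dbo.customers ({tgt_cols}) VALUES ({params})", rows)
conn.commit()
conn.close()
```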
-
Actively hiring for the following roles: Generative AI Developer, Data Warehouse Engineer/Architect, Ruby on Rails Developer, and SD-WAN Network Engineer. Share your resume with archana@allnessjobs.com.
Hello Professionals, we are #hiring a #BIConsultant in #Hyderabad. If you're interested, share your resume with archana@allnessjobs.com #BIConsultant #OracleBI #DataAnalytics #OBIEE #OracleDataIntegrator #FusionAnalyticsWarehouse #OAC #OracleCloud #DataVisualization #DataWarehousing #SQL #PLSQL #ETL #FAW #BIApps #OracleAnalyticsCloud #ADW #DataModeling #ReportTesting #OACReports #BIArchitecture #CloudBI #DataPlatforms #PerformanceTuning #DataOptimization #CloudIntegration #Snowflake
-
#Integrating PLSQL and Informatica for #Seamless #Data #Processing and #Transformation
In the world of #data management, achieving seamless integration between various tools is key to building robust and efficient data solutions. One such powerful combination is #integrating #PLSQL with #Informatica. By leveraging the strengths of both, we can create a seamless data processing and transformation pipeline. PL/SQL's robust database management capabilities, paired with Informatica's powerful ETL processes, allow for the efficient handling of complex data workflows. Here's why this synergy is so impactful:
#Efficiency: Combining PL/SQL's direct database interactions with Informatica's streamlined ETL processes reduces data handling time and improves overall efficiency.
#Scalability: This integration supports large-scale data operations, ensuring that as data grows, the system remains performant and reliable.
#Flexibility: Informatica's flexible mapping and transformation tools, combined with PL/SQL's powerful querying and scripting capabilities, provide a versatile solution for varied data scenarios.
#Data Quality: Enhanced data validation and transformation techniques ensure high data quality, crucial for accurate analytics and reporting.
In my recent project, integrating these tools allowed us to streamline our data workflows, significantly reducing ETL runtime while ensuring data accuracy and consistency. This approach not only enhanced performance but also provided a scalable solution for future data growth. The synergy between PL/SQL and Informatica is indeed powerful for crafting robust data solutions; a small sketch of the pattern follows below. Looking forward to exploring more integration techniques and hearing about your experiences with these tools!
#PLSQL #Informatica #DataIntegration #DataProcessing #ETL #DataManagement #TechiePosts #humanresources #hr #jobinterviews #hiringandpromotion #jobalert #nowhiring #job #gethired #jobopening #jobfair #recruiting #hiring #joinourteam #jobs #jobhiring #remotework #jobsearch #jobsearching #jobseekers #workingathome #hire #opentowork #hireme #jobhunt #jobseeker #recruitment #technology #deeplearning #homeoffice #culture #database #oracle #sqlserver #mssqlserver
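Here is that sketch, with hypothetical object names throughout: set-based validation is pushed down into a PL/SQL block (the kind of logic Informatica would invoke as a pre-session stored procedure), and the ETL flow then extracts only the rows that passed.

```python
# PL/SQL validation close to the data, then a filtered ETL extraction.
import cx_Oracle

conn = cx_Oracle.connect("etl_user", "etl_pwd", "dbhost/ORCLPDB1")
cur = conn.cursor()

# PL/SQL flags invalid rows in one set-based pass before extraction starts.
cur.execute("""
    BEGIN
        UPDATE stg_orders
           SET valid_flag = CASE
                                WHEN order_amt > 0 AND cust_id IS NOT NULL THEN 'Y'
                                ELSE 'N'
                            END;
        COMMIT;
    END;
""")

# The ETL side (simulated here) reads only validated rows into the pipeline.
cur.execute("SELECT order_id, cust_id, order_amt FROM stg_orders WHERE valid_flag = 'Y'")
for row in cur.fetchmany(5):
    print(row)

conn.close()
```

Keeping the validation in the database avoids shipping bad rows through the pipeline at all, which is where much of the runtime reduction comes from.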
-
#Hiring ETL Developer
Job Description
Position: ETL Developer (Data Conversions, SSIS, SQL Server, Python, ERP Expertise)
Requirements:
Data Conversion Expertise: Must have led or been significantly involved in data conversion projects in the last two roles, specifically migrating data from Teradata and IBM mainframe systems into SQL Server.
SSIS & SQL Server Mastery: Advanced experience developing and optimizing SSIS packages, alongside proficiency in MS SQL Server with a focus on complex SQL query writing and database performance tuning. Minimum 8-10 years of SSIS, ETL, and SQL development. Knowledge of MS SQL Server 2012 or later. Experience with SSRS, SSIS, and T-SQL; able to develop SSIS packages.
Python Proficiency: Strong Python skills to automate ETL processes and handle intricate data transformations.
Data Mapping & Modeling: Demonstrated experience conducting detailed data mapping and creating precise data models for migration. Strong data analysis and data migration script creation experience.
Data Flow Diagrams: Ability to craft clear and effective data flow diagrams to illustrate migration and transformation processes.
ERP Data Expertise: Extensive experience working with ERP data models, with a deep understanding of core ERP entities like customers, suppliers, items, and transactions, and the ability to load data into custom ERP systems.
Responsibilities:
Design and implement SSIS packages and ETL processes based on business requirements.
Conduct comprehensive data mapping, data analysis, and data modeling to support ETL processes.
Use Python scripting to automate data transformation processes and streamline ETL workflows.
Write SQL and T-SQL scripts/statements to analyze and translate legacy ERP data into the new ERP platform.
Create, debug, and execute T-SQL scripts and stored procedures that match data mapping specifications (see the sketch after this post).
Create and maintain detailed data flow diagrams showcasing the logical flow of data throughout the migration process.
Perform quality assurance and database validation on all test and live conversions.
Leverage a strong understanding of ERP concepts to ensure seamless data interpretation and migration strategies.
Work with ERP data models, including customers, suppliers, items, and open transactions, ensuring data is accurately migrated into custom ERP systems.
Create jobs for batch and real-time processing of data from internal and external sources.
Additional Skills (Good to Have):
NiFi: Knowledge of NiFi for building efficient data flows and processing pipelines.
Apache Airflow: Familiarity with Apache Airflow for orchestrating complex ETL workflows and managing data pipelines.
Splunk: Experience with Splunk for monitoring, searching, and analyzing large volumes of data.
Apache Spark: Familiarity with Apache Spark for large-scale data processing and distributed computing.
Interested candidates can share their resume at parash.joshi@prisallc.com
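To illustrate two of the responsibilities above (executing T-SQL stored procedures built from mapping specs, and QA validation of a conversion), here is a minimal Python sketch using pyodbc; the procedure, table, and server names are hypothetical.

```python
# Run a T-SQL conversion proc, then reconcile source and target row counts.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost;DATABASE=erp;Trusted_Connection=yes"
)
cur = conn.cursor()

# Execute the conversion procedure (assumed to implement the mapping spec).
cur.execute("EXEC dbo.usp_convert_items")
conn.commit()

# QA validation: staged legacy rows and converted target rows must match.
src = cur.execute("SELECT COUNT(*) FROM staging.legacy_items").fetchone()[0]
tgt = cur.execute("SELECT COUNT(*) FROM dbo.items").fetchone()[0]
print(f"source rows={src}, target rows={tgt}, reconciled={src == tgt}")

conn.close()
```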