Geoff Baillie's Post

Hooking Snowflake up to your application? At first I was more than skeptical. Having connected similar data warehousing tools in the past (with "mixed results" being a kind recounting of the experience), I had my doubts about how effective a strategy this could truly be. One year and two successful deployments of Snowflake to two applications later, I can safely say it has been the easiest, most performant, and most seamless integration of this kind I have ever managed.

To be clear, I am not saying you should replace your application database with Snowflake. MongoDB or Postgres are still top of the stack (pun intended) when it comes to running the nuts and bolts of your application. But if you have data that you're bringing in from outside the friendly confines of your application - especially if you are a Snowflake shop already - I would highly recommend an integration if the circumstances are right. Just be careful to write the appropriate safeguards, throttle the warehouse you have dedicated to the service, and tighten up your queries. It's still pay-as-you-go, so you want to make sure you don't have runaway costs because of a bug or oversight in your application.
More Relevant Posts
Principal Data Engineer @ Altimetrik | 🌟 Top Data Engineering Voice 🌟| 22K+ Followers | Ex Carelon, ADP, CTS | 2x AZURE & 2x Databricks Certified | SNOWFLAKE | SQL | Informatica | Spark | Bigdata | Databricks | PLSQL
🚀 Excited to share a comprehensive guide on connecting to Snowflake DB via SQL Developer or VS Code! Whether you're a seasoned developer or just diving into the world of databases, these steps will streamline your workflow and empower you to harness the power of Snowflake for your projects. 💻

🔍 SQL Developer connection steps:
1. Download the Snowflake JDBC driver: obtain the Snowflake JDBC driver from the Snowflake website or via Maven.
2. Install SQL Developer: if you haven't already, download and install SQL Developer.
3. Add the JDBC driver to SQL Developer: navigate to `Tools` > `Preferences` > `Database` > `Third Party JDBC Drivers` and add the Snowflake JDBC driver (.jar file) there.
4. Create a new connection: click the green plus icon in the Connections tab to create a new connection, and select `Snowflake` as the connection type.
5. Provide connection details: enter the necessary connection details such as username, password, account name, warehouse, etc.
6. Test the connection: once the details are entered, test the connection to ensure it's working correctly.
7. Start querying: after successfully connecting, you can start writing and executing queries.

🛠️ VS Code connection steps:
1. Install the Snowflake extension: if there's a Snowflake extension available for VS Code, install it from the VS Code marketplace.
2. Configure the extension: configure it with your Snowflake account details, including username, password, account name, warehouse, etc.
3. Connect to Snowflake: once configured, use the extension to connect to Snowflake.
4. Write queries: with the connection established, you can write and execute SQL queries directly within VS Code.

Empower yourself with the tools to unlock the full potential of Snowflake DB. Happy querying! 🚀

#Snowflake #SQLDeveloper #VSCode #Database #TechTips #DeveloperCommunity
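Whichever client you use, a quick sanity-check query (not part of the original steps, just a useful habit) confirms the session landed on the account, role, and warehouse you expect:

```sql
-- Run this right after connecting to verify the session context.
SELECT
    CURRENT_ACCOUNT()   AS account_name,
    CURRENT_USER()      AS user_name,
    CURRENT_ROLE()      AS role_name,
    CURRENT_WAREHOUSE() AS warehouse_name,
    CURRENT_VERSION()   AS snowflake_version;
```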
❄ Did you know Snowflake has duplicated functions? For example:
- RLIKE & REGEXP
- NVL & IFNULL
- CAST & TO_<EVERY_TYPE>
… plus related shorthands such as "::", which also casts data types.

Even though their output is exactly the same, there is a reason they are still here, and the answer is… compatibility with different SQL dialects! Imagine migrating from Oracle (or some other database) to Snowflake without knowing all of these new and fancy functions. You would need to go through your SQL very carefully (I still recommend doing that) and change most of Oracle's non-ANSI functions to the Snowflake ones, which can sometimes be challenging. Snowflake simplifies this process by supporting the "legacy" functions, so there is no need to do that. It also helps code readability, since you don't have to switch instantly to a brand-new set of functions and patterns.

Do you know other examples? A quick side-by-side is below.

#Snowflake #dataengineering
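As a quick illustration, each pair below returns the same result (the literals are arbitrary):

```sql
-- The aliases exist purely for cross-dialect compatibility; pick whichever reads best.
SELECT
    'snowflake' RLIKE 'snow.*'      AS rlike_result,      -- TRUE
    'snowflake' REGEXP 'snow.*'     AS regexp_result,     -- TRUE
    NVL(NULL, 'fallback')           AS nvl_result,        -- 'fallback'
    IFNULL(NULL, 'fallback')        AS ifnull_result,     -- 'fallback'
    CAST('42' AS INTEGER)           AS cast_result,       -- 42
    TO_NUMBER('42')                 AS to_number_result,  -- 42
    '42'::INTEGER                   AS shorthand_result;  -- 42
```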
It’s #TechnicalTuesday again, which means Chris Hastie is back with another article on Snowflake external access! This week he covers how to communicate with the REST API of a Tableau Server/Cloud deployment to refresh extracts in a published data source or workbook. ⬇ Learn more below. https://lnkd.in/ggTmwNNw
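For context on what "external access" involves (this is not the article's code; the object and host names here are invented), the scaffolding on the Snowflake side generally looks like a network rule plus an external access integration:

```sql
-- Allow outbound calls from Snowflake to a (hypothetical) Tableau host.
CREATE OR REPLACE NETWORK RULE tableau_api_rule
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = ('my-tableau-server.example.com');

CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION tableau_api_integration
  ALLOWED_NETWORK_RULES = (tableau_api_rule)
  ENABLED = TRUE;

-- A Python (or other handler-language) procedure/UDF can then reference it via
--   EXTERNAL_ACCESS_INTEGRATIONS = (tableau_api_integration)
-- and issue the Tableau REST calls from inside Snowflake.
```

The linked article covers the Tableau-specific side of the conversation.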
Let's talk about integration with Snowflake using the SQL API.

The Snowflake SQL API is a REST API used to access and update data in a Snowflake database, empowering developers to build custom applications and integrations.

Capabilities the SQL API provides:
• execution of SQL statements (including procedures),
• fetching results concurrently,
• status checks and cancellation of executed statements (each API call returns an execution ID, which can be used to check the status or cancel the execution later).

Restriction: the API does not support uploading (PUT) data files from a local file system to the server, or downloading (GET) data files from the server to a local file system.

To read about how to submit a request, refer to https://lnkd.in/gzG84rtm

#snowflake #SQLAPI #dataengineering #dataanalytics #API #development #datawarehouse
Sometimes the little things go a long way. If you are a #sql developer, you've probably run into this before: one leftover trailing comma breaking an otherwise valid query. But no longer - Snowflake now allows trailing commas! Read more from our own Qinyi Ding by following the link below. https://lnkd.in/gnUb33Eu

#sqldelight #sqldeveloper #sqlskills #datawarehousing #dataengineering #datacloud
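For anyone who hasn't seen the feature yet, a minimal sketch (table and column names are made up):

```sql
-- The dangling comma before FROM used to be a syntax error; Snowflake now accepts it,
-- which makes commenting out or reordering the last column painless.
SELECT
    order_id,
    customer_id,
    order_total,   -- trailing comma: now allowed
FROM orders
LIMIT 10;
```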
Experienced Senior Data Engineer | Specialising in Big Data, Spark, Hadoop, Snowflake | Proficient in AWS & Azure | Driving Data Innovation
I am back with another update from our ongoing project series. This time we're diving into the dynamic world of e-commerce, specifically focusing on supply chain enhancements for our client. Here's the scoop on what we're achieving with Azure:

Project spotlight: streamlining an e-commerce supply chain

The challenge: our e-commerce client tasked us with creating a robust pipeline to fetch real-time updates on shipped orders. The goal? To give the business instant insight into order status, location, and delivery progress.

Why this project rocks:
- Skill enhancement: the project is a masterclass in API integration, JSON handling in SQL, and Spark.
- Azure mastery: we're not just using Azure, we're understanding it inside out, particularly its data-handling components.
- Business impact: beyond the tech, this project is set to change how our client views and manages their supply chain.

The approach: using Azure, we've built a pipeline that sources data through APIs, letting us capture live updates on orders directly from the warehouse.

Technical breakthroughs: #API, #JSON, and #SQL
- Data journey: once we fetch the order updates, the data - typically in JSON format - is stored in a database.
- #SQL: we flatten the JSON data in SQL to make it more digestible for analysis (see the sketch below this post).
- #Spark: for extra firepower, we use Apache Spark with custom user-defined functions (UDFs) to process and transform the data efficiently.

Outcome: real-time insights with #PowerBI
The transformed data feeds into Power BI, where we build comprehensive reports. These give our client a real-time view of every shipped order: its current status, its location, and whether it has reached its destination.

Project growth: from scheduled to real-time
- Phase one: we started with a scheduled pipeline, ensuring reliable and timely data flow.
- Next steps: we're gearing up for real-time (or near real-time) processing to deliver even faster insights.

#Azure #EcommerceInnovation #SupplyChainTransformation #DataMagic #ADF
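The post doesn't name the exact database engine, so purely as an illustration of "flattening JSON in SQL": on an Azure SQL / SQL Server-style engine the idea might look roughly like this (table, column, and JSON path names are all invented):

```sql
-- Explode the items array inside each order-update payload into one row per item.
SELECT
    o.order_id,
    item.sku,
    item.quantity,
    item.ship_status
FROM dbo.order_updates AS o
CROSS APPLY OPENJSON(o.payload_json, '$.items')
     WITH (
         sku         NVARCHAR(50) '$.sku',
         quantity    INT          '$.quantity',
         ship_status NVARCHAR(30) '$.status'
     ) AS item;
```

The same reshaping can be pushed to Spark instead (for example, `explode` on the array column) when the payloads get large, which is where the UDFs mentioned above come in.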
Experienced Senior Data Engineer | Specialising in Big Data, Spark, Hadoop, Snowflake | Proficient in AWS & Azure | Driving Data Innovation
I'm thrilled to announce the start of a special series where I'll be sharing my journey and insights into the dynamic world of Big Data, Cloud Computing, and Data Warehousing. Expect a new post every 1-2 weeks.

What's in store? Each post will kick off with an exploration of the project requirements and culminate in the grand finale, where I'll share the final code along with a GitHub link for you to explore.

Development (Big Data)
- I created a Lambda function through the Service Catalog. It seamlessly transfers data from an S3 bucket into fact and dimension tables in PostgreSQL.
- I tackled the complexity of a nested JSON file, ensuring its smooth transition from the RawZone S3 bucket through the raw-zone Hive table and the curated-zone warehouse Hive table (a favorite among data scientists and teams), then on to Redshift tables, and ultimately integrating with Salesforce.

Debugging / maintenance (Snowflake)
- I debugged a stored procedure that reads data from Snowflake reporting-layer tables, driven by a config table, and drops it into an S3 bucket for a third-party reporting tool.
- How it works: the procedure takes a date as a parameter, checks the config table for the tables that need files (where Flag = Y), then loops through them, converts the data to CSV files, and loads them into the S3 bucket for the reporting tools to consume (a minimal sketch of that unload step follows below).
- My requirement was to enhance this functionality with extra features: a file name parameter plus start-from and start-to dates, so it can run for many dates in one go.

Your thoughts, feedback, and experiences are invaluable to this journey. Let's learn and grow together in this series!

#bigdata #cloudcomputing #datawarehousing #techinsights #learningtogether
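The unload step described above (Snowflake table to CSV files in S3) typically comes down to a COPY INTO <stage> statement inside the procedure's loop; a minimal sketch with invented stage, table, and date values:

```sql
-- Assumes an external stage (reporting_s3_stage) already points at the target bucket.
COPY INTO @reporting_s3_stage/exports/orders_20240131_
  FROM (
      SELECT *
      FROM reporting.orders_summary
      WHERE report_date = '2024-01-31'   -- the procedure's date parameter
  )
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' COMPRESSION = NONE)
  HEADER = TRUE
  OVERWRITE = TRUE;
```

The enhancement described above (file name plus start/end dates) then amounts to extra procedure arguments that drive the path prefix and the WHERE clause.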
❄️ This week’s #dataEngineering tip is about stored procedures in Snowflake. For many folks they might be that strange thing used for transformations years ago. Me too. 🙃 I still remember thousands of PL/SQL packages. 🤯

#Snowflake does support stored procedures, and they can be a powerful tool for many tasks. In my latest blog post I’ve tried to create a complete guide - the missing manual for stored procedures in Snowflake. 🔥

You will find everything you need to become a proficient stored procedure user, including:
✅ supported languages,
✅ the difference between a UDF and a stored procedure,
✅ the difference between owner’s rights and caller’s rights,
✅ a step-by-step guide to creating a stored procedure in different languages,
✅ return types, arguments, and optional arguments,
✅ how to use stored procedures for dynamic SQL generation, and a ton of other examples.

🧑💻 Stored procedures can be an efficient way to handle many administrative tasks in your accounts, and I think they’re worth knowing how to work with. 💪 Do you use stored procedures? There’s a tiny example below, and the link to the full guide is in the comments.

#dataSuperhero
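Not the blog post itself, but a tiny Snowflake Scripting taste of the kind of thing it covers - a SQL-language procedure that builds and runs dynamic SQL (all names are invented; concatenating identifiers like this is for illustration only, so validate inputs in real code):

```sql
CREATE OR REPLACE PROCEDURE preview_table(full_table_name STRING, max_rows INTEGER)
RETURNS TABLE()
LANGUAGE SQL
AS
$$
DECLARE
    query VARCHAR;
    res   RESULTSET;
BEGIN
    -- Build the statement as text, then execute it (dynamic SQL).
    query := 'SELECT * FROM ' || full_table_name || ' LIMIT ' || max_rows;
    res   := (EXECUTE IMMEDIATE :query);
    RETURN TABLE(res);
END;
$$;

-- Example call (sample database availability varies by account):
-- CALL preview_table('SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.NATION', 5);
```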
Techno-Functional: OTC, PTP, SCM & Insurance Domains | Memberships: FIETE, SEFM, SM-IEEE, MBCS, MIET | Clouds: Azure, GCP, Databricks | Languages: Python, PL/SQL, Shell | Warehouse: Snowflake, Oracle, Cassandra | Modeling: Erwin | ML & AI
Snowflake Administration: A Comprehensive Step-by-Step Guide
Snowflake Administration: A Comprehensive Guide - DZone (dzone.com)
If you've ever wondered how to seamlessly transfer MS SQL Server data into Snowflake, look no further! Discover how to effortlessly ingest data from your on-prem MSSQL instance into Snowflake, using basic and advanced modes in my latest blog. Don't miss the video walkthrough at the end for a comprehensive guide. Check out the blog here: https://lnkd.in/g7rF4k55
How to Use Matillion's Database Query Component to Ingest Microsoft… (matillion.com)
If you're a $10MM+ consumer brand and have built your own data stack (Snowflake, BigQuery, Redshift), you need to start thinking about a DBA. Not an old-school DBA (e.g. SQL Server), but someone who really understands how to optimize. In reviewing the tech stacks of a lot of brands, I've noticed there is zero optimization, and brands are spending $2,000+/month more than they should. We're doing a lot of optimization ourselves and will likely reduce our costs by 15%+. DM 👇 if you want to learn more about some of what we've done to reduce our BigQuery and Snowflake costs.
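On the Snowflake side, a first pass at finding where the credits go is usually just a query against ACCOUNT_USAGE; a minimal sketch (the 30-day window is arbitrary):

```sql
-- Credits consumed per warehouse over the last 30 days; the biggest numbers are
-- usually the first candidates for right-sizing, auto-suspend tuning, or query rewrites.
SELECT
    warehouse_name,
    SUM(credits_used)                AS credits_30d,
    COUNT(DISTINCT start_time::DATE) AS active_days
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY credits_30d DESC;
```

BigQuery has an analogous starting point in its INFORMATION_SCHEMA jobs views, but the query above is Snowflake-specific.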