How does Modern Treasury share data with customers on 14 platforms, including Athena, BigQuery, and Snowflake? Prequel. https://lnkd.in/ebeEZzbj
Prequel’s Post
-
Confused by the eternal dilemma in the world of data: Scale Up ⬆️ or Scale Out ➡️? Is your data growing faster than you can keep up? You're not alone! Dive into Snowflake's scaling tactics, designed for seamless operations and rapid query responses, and learn when to choose Scale Up versus Scale Out with virtual warehouses. Bid adieu to scaling worries and join the journey now! Read the full article here: https://lnkd.in/g5ZzBY_6 #Snowflake #DataWarehouse #VirtualWarehouse #ScaleUp #ScaleOut #MultiCluster #Scaling #QueryPerformance #Concurrency #DataEngineering
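The scale-up vs. scale-out choice the post describes can be sketched as a simple decision rule: resize the warehouse when individual queries are slow, add clusters when queries queue up. This is a minimal illustrative sketch in Python — the thresholds, warehouse sizes, and function are assumptions for illustration, not Snowflake defaults or APIs.

```python
# Sketch of the scale-up vs. scale-out decision. Thresholds and the
# size ladder are illustrative assumptions, not Snowflake defaults.
SIZES = ["X-Small", "Small", "Medium", "Large", "X-Large"]

def recommend_scaling(avg_query_seconds, queued_queries, current_size):
    """Scale OUT when many queries queue (concurrency problem);
    scale UP when individual queries are slow (capacity problem)."""
    if queued_queries > 5:
        return "scale out: add clusters via a multi-cluster warehouse"
    if avg_query_seconds > 60 and current_size != SIZES[-1]:
        bigger = SIZES[SIZES.index(current_size) + 1]
        return f"scale up: resize warehouse to {bigger}"
    return "no change needed"

print(recommend_scaling(120, 0, "Small"))   # slow queries -> scale up
print(recommend_scaling(10, 12, "Medium"))  # heavy concurrency -> scale out
```

The key design point: scaling up helps a single heavy query finish faster, while scaling out only helps when the bottleneck is many concurrent queries competing for one cluster.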
-
What's new in Snowflake? Among many new items, here are some noteworthy ones: ✅ Custom Classification for #dataPrivacy ✅ Data metric functions (DMF) for #dataQuality ✅ Snowflake Cortex Functions for #LLM ✅ Snowflake CLI 2.x Check out this article for details. #snowflake #data #datacloud https://lnkd.in/gvac3PDJ
-
Discover the game-changing benefits of using Snowflake for Business Intelligence in one of our latest blogs! Unlock insights, scalability, and efficiency like never before. Read it here 👇 #snowflakedatacloud #businessintelligence
The Benefits Of Using Snowflake For Business Intelligence
phdata.io
-
Data Analyst | Python | Tableau | MongoDB | SQL | Open to New Opportunities C-DAC: Centre for Development of Advanced Computing
Today, I took another step forward in my data journey by diving into Snowflake. Understanding how Snowflake separates storage and compute, enabling truly scalable and cost-efficient data solutions, was an eye-opener. The ease of managing and analyzing massive amounts of data with such speed and flexibility is truly impressive. I’m thrilled to see how this powerful data warehousing solution can transform data management and analytics in today’s fast-paced world. Looking forward to exploring more features and use cases! If you have any tips or resources on mastering Snowflake, I'd love to hear them. #DataScience #Snowflake #DataWarehouse #LearningJourney #DataAnalytics
-
FinOps…for Data?? The rise of AI means more data, more compute, and more $$$. Farial Shahnaz and I are speaking at the upcoming Snowflake #DataCloudSummit with Capital One Software on how you can apply FinOps principles to understand the value of your data spend and avoid uncomfortable conversations with your CFO. Hope to see you there! https://lnkd.in/eHxrhr_u
Snowflake | Snowflake Summit 24
reg.summit.snowflake.com
-
Looking to optimize costs while enhancing your data capabilities with Snowflake? Look no further than a customized migration plan, tailored around 7 key areas to get the most value out of your Snowflake platform. As an Elite partner, we offer the right expertise to tailor solutions to your specific cloud-data needs. Connect with me to learn more! TEKsystems + Snowflake
Create a Customized Plan for Your Snowflake Data Migration
teksystems.com
-
Enterprise Account Executive Simplify your data strategy. Accelerate your AI strategy. Scale with applications.
Improving economics for our customers has always been a core value at Snowflake. Learn how we helped ESO cut costs by 60%: https://okt.to/nXy7vR
Snowflake's Performance Optimizations Help ESO Reduce Costs by 60%
snowflake.com
-
Snowflake Optimization Tip of the Day #6 ➡ Identify and remove failing workloads. Even in mid-sized data teams, a meaningful chunk of spend often goes to workloads that consistently fail at the end for various reasons. These workloads get no work done but still consume compute credits, so simply identifying them can reduce Snowflake costs. The tricky part is defining what exactly counts as a failing workload. In the first comment, you'll find a blog by my co-founder Sahil Singla that lays out parameters for classifying a workload as failing. Remediation is generally finding the user responsible for the workload and asking them to fix or remove it. Beyond saving money, a secondary advantage for data teams is keeping track of which data is relevant for business users and which is not.
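The approach above — group runs by user and query, flag groups that only ever fail, and total the credits they burned — can be sketched as follows. This is a minimal illustration with made-up record shapes and credit figures; in Snowflake the equivalent fields would come from the ACCOUNT_USAGE.QUERY_HISTORY view.

```python
# Sketch: flag repeatedly failing workloads from query-history-style
# records. Record fields and credit numbers are illustrative assumptions.
from collections import defaultdict

def failing_workloads(queries, min_failures=3):
    """Group queries by (user, query text) and report groups where every
    run failed, along with the compute credits wasted on them."""
    groups = defaultdict(list)
    for q in queries:
        groups[(q["user"], q["query_text"])].append(q)
    report = []
    for (user, text), runs in groups.items():
        failures = [r for r in runs if r["status"] == "FAILED"]
        if len(failures) >= min_failures and len(failures) == len(runs):
            report.append({"user": user, "query": text,
                           "wasted_credits": sum(r["credits"] for r in failures)})
    return report

history = [
    {"user": "ana", "query_text": "CREATE TABLE t AS ...", "status": "FAILED", "credits": 0.4},
    {"user": "ana", "query_text": "CREATE TABLE t AS ...", "status": "FAILED", "credits": 0.5},
    {"user": "ana", "query_text": "CREATE TABLE t AS ...", "status": "FAILED", "credits": 0.6},
    {"user": "bob", "query_text": "SELECT 1", "status": "SUCCESS", "credits": 0.1},
]
print(failing_workloads(history))  # ana's always-failing job, 1.5 credits wasted
```

Requiring that *all* runs in a group failed (not just some) is one simple way to avoid flagging flaky-but-useful jobs; real parameters for "failing" would be tuned per team, as the post notes.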
-
We expanded our Redshift integration to support query logs. Combined with data from our dbt integration, we can identify cost-optimization opportunities such as:
→ Tables that refresh far more frequently than necessary and waste compute resources.
→ dbt models that keep refreshing but have no downstream users or models.
We have already been doing this for our BigQuery customers for some time, and given Redshift's popularity, it was only natural to extend support to it as well. Read more in our changelog.
Redshift query logs
synq.io
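The two opportunities above boil down to comparing refresh counts against downstream reads per table. Here is a minimal illustrative sketch — the event lists and table names are made up; in practice the inputs would come from warehouse query logs and dbt run metadata, not from these hypothetical lists.

```python
# Sketch: compare per-table refresh counts vs. downstream reads to spot
# over-refreshed or unused tables. Inputs are illustrative assumptions.
from collections import Counter

def refresh_read_report(refresh_events, read_events):
    """For each refreshed table, count refreshes vs. downstream reads.
    Tables with many refreshes and zero reads are cost-cut candidates."""
    refreshes = Counter(refresh_events)
    reads = Counter(read_events)
    return {t: {"refreshes": n, "reads": reads.get(t, 0)}
            for t, n in refreshes.items()}

refreshes = ["orders_daily"] * 24 + ["users_dim"] * 2
reads = ["users_dim"] * 5  # nothing ever reads orders_daily
report = refresh_read_report(refreshes, reads)
unused = [t for t, s in report.items() if s["reads"] == 0]
print(unused)  # orders_daily refreshes 24x with no downstream users
```

A zero-read table that refreshes hourly is the clearest win; a high refresh-to-read ratio is a softer signal worth reviewing with the table's owner.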