September 09, 2024

Does your organization need a data fabric?

While real-time data integration and data transformation are key capabilities of data fabrics, their defining capability is providing centralized, standardized, and governed access to an enterprise’s data sources. “When evaluating data fabrics, it’s essential to understand that they interconnect with various enterprise data sources, ensuring data is readily and rapidly available while maintaining strict data controls,” says Simon Margolis, associate CTO of AI/ML at SADA. “Unlike other data aggregation solutions, a functional data fabric serves as a ‘one-stop shop’ for data distribution across services, simplifying client access, governance, and expert control processes.” Data fabrics thus combine features of other data governance and dataops platforms. They typically offer data cataloging functions so end users can discover the organization’s data sets. Many help data governance leaders centralize access control while giving data engineers tools to improve data quality and create master data repositories. Other differentiating capabilities include data security, data privacy functions, and data modeling features.
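To make the “one-stop shop” idea concrete, here is a minimal sketch of a fabric-style access layer: one catalog fronting many sources, with discovery and centralized access control at a single entry point. All names here (DataFabric, the roles, the sample dataset) are hypothetical illustrations, not any vendor’s API.

```python
# Minimal sketch of a data-fabric access layer: a single governed entry
# point that fronts many underlying sources. Everything here is a
# hypothetical illustration, not a real product's API.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class DatasetEntry:
    source: Callable[[], list]              # how to fetch from the underlying system
    owner: str                              # steward recorded in the catalog
    allowed_roles: List[str] = field(default_factory=list)


class DataFabric:
    """One-stop access: catalog lookup + policy check + dispatch."""

    def __init__(self) -> None:
        self.catalog: Dict[str, DatasetEntry] = {}

    def register(self, name: str, entry: DatasetEntry) -> None:
        self.catalog[name] = entry          # cataloging: datasets become discoverable

    def search(self, term: str) -> List[str]:
        return [name for name in self.catalog if term in name]

    def get(self, name: str, role: str) -> list:
        entry = self.catalog[name]
        if role not in entry.allowed_roles:  # centralized, standardized access control
            raise PermissionError(f"role {role!r} may not read {name!r}")
        return entry.source()               # dispatch to the real source


fabric = DataFabric()
fabric.register(
    "sales.orders",
    DatasetEntry(source=lambda: [{"id": 1, "total": 42.0}],
                 owner="sales-data-team",
                 allowed_roles=["analyst"]),
)
print(fabric.search("sales"))                  # discovery via the catalog
print(fabric.get("sales.orders", "analyst"))   # governed access
```

In a real data fabric the catalog, policies, and connectors are managed platform services rather than in-process objects, but the shape is the same: every consumer goes through one governed entry point instead of integrating with each source directly.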


The Crucial Role of Manual Data Annotation and Labeling in Building Accurate AI Systems

Automatic annotation systems frequently suffer from serious limitations, most notably around accuracy. Despite its rapid evolution, AI can still misunderstand context, fail to spot complex patterns, and perpetuate biases inherent in its data. For example, an automated annotation system may mislabel an image of a person holding an object because it cannot handle complicated scenes or overlapping objects. Similarly, in textual data, automated systems may misread cultural references, idiomatic expressions, or sentiment. ... Manual annotation, on the other hand, uses human expertise to label data, ensuring accuracy, contextual understanding, and bias reduction. Humans are naturally skilled at resolving ambiguity, reading context, and making sense of complex patterns that machines may fail to grasp. This capability is critical in applications requiring absolute precision, such as healthcare diagnostics, legal document interpretation, and ethical AI deployment. Manual annotation also adds a level of fairness that automated procedures typically lack: human annotators can recognize and mitigate biases in datasets, whether racial, gender-based, or cultural.
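In practice the two approaches are usually combined. A common human-in-the-loop pattern, sketched below under assumed names and an assumed confidence threshold, accepts high-confidence automatic labels and routes uncertain items to human annotators, who handle the ambiguity, context, and bias checks described above.

```python
# Human-in-the-loop labeling sketch: auto-labels above a confidence
# threshold are accepted; everything else is queued for human review.
# The threshold value and the sample predictions are illustrative.
from typing import List, Tuple

CONFIDENCE_THRESHOLD = 0.90  # below this, a human annotator reviews the item


def route_annotations(
    predictions: List[Tuple[str, str, float]]  # (item_id, label, confidence)
):
    accepted, review_queue = [], []
    for item_id, label, confidence in predictions:
        if confidence >= CONFIDENCE_THRESHOLD:
            accepted.append((item_id, label))
        else:
            review_queue.append(item_id)     # humans resolve ambiguity and bias
    return accepted, review_queue


auto_labels = [
    ("img-001", "person_holding_cup", 0.97),
    ("img-002", "person_holding_cup", 0.55),  # overlapping objects: model unsure
]
accepted, review_queue = route_annotations(auto_labels)
print(accepted)       # [('img-001', 'person_holding_cup')]
print(review_queue)   # ['img-002'] -> sent to manual annotation
```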


AI orchestration: Crafting harmony or creating dependency?

In a collaborative relationship, both parties play equal and complementary roles. AI excels at processing enormous amounts of data, recognizing patterns, and performing certain types of analysis, while people excel at creativity, emotional intelligence, and complex decision-making. In this relationship, the human keeps agency by critically evaluating AI outputs and making final decisions. However, the relationship can easily veer into dependency, where we become unable or unwilling to perform tasks without AI help, even tasks we could previously do on our own. As AI outputs have become strikingly human-like and convincing, it is easy to accept them without critical evaluation or understanding, even knowing the content may be a hallucination, an AI-generated output that appears convincing but is false or misleading. ... As AI continues to advance and becomes harder to distinguish from human interaction, the line between collaboration and dependency blurs further. Or worse, as the historian Yuval Noah Harari, renowned for his works on the history and future of humankind, points out, intimacy is a powerful weapon, one that can be used to persuade us.


The deflating AI bubble is inevitable — and healthy

Predicting the future is generally a fool’s errand, as Nobel Prize-winning physicist Niels Bohr recognized when he quipped, “Prediction is very difficult, especially about the future.” This was particularly true in the early 1990s as the web started to take off. Even internet pioneer and Ethernet standard co-inventor Robert Metcalfe doubted the internet’s viability, predicting in 1995 that it had only a 12-month future. Two years later, he literally ate his words at the 1997 WWW Conference, blending a printed copy of his prediction with water and drinking it. But there comes a point in a new technology’s life when its potential benefits become clear even if the exact shape of its evolution remains opaque. ... Many AI deployments and integrations are not revolutionary, however, but add incremental improvements and value to existing products and services. Graphics and presentation software provider Canva, for example, has integrated Google’s Vertex AI to streamline its video editing offering; Canva users can skip a number of tedious editing steps and create videos in seconds rather than minutes or hours. And WPP, the global marketing services giant, has integrated Anthropic’s Claude AI service into its internal marketing system, WPP Open.


Blockchain And Quantum Computing Are On A Collision Course

Herman warns, “The real danger regarding the future of blockchain is that it’s used to build critical digital infrastructures before this serious security vulnerability has been fully investigated. Imagine a major insurance company putting all its customers, at great expense, into a blockchain-based network, and then three years later having to rip it all out to install a quantum-secure network in its place.” Despite the bleak outlook, Herman offers a solution that lies within the very technology posing the threat. Quantum cryptography, particularly quantum random-number generators and quantum-resistant algorithms, could provide the safeguards needed to protect blockchain networks from quantum attacks. “Quantum random-number generators are already being implemented today by banks, governments, and private cloud carriers. Adding quantum keys to blockchain software, and to all encrypted data, will provide unhackable security against both a classical computer and a quantum computer,” he notes. Moreover, the U.S. National Institute of Standards and Technology (NIST) has stepped in to address the issue by releasing standards for post-quantum cryptography.
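Hash-based signatures are one family of quantum-resistant schemes (NIST’s newly standardized SLH-DSA is a production-grade relative). As a toy illustration of the principle, the sketch below implements a Lamport one-time signature using only Python’s standard library; its security rests on hash preimage resistance, which quantum computers weaken only modestly. Each key pair must sign exactly one message, and this is for illustration only, not for real systems.

```python
# Toy Lamport one-time signature: a hash-based, quantum-resistant scheme.
# Illustration only; sign at most one message per key pair.
import hashlib
import secrets


def keygen():
    # 256 pairs of random secrets; the public key is the hash of each secret.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[hashlib.sha256(s).digest() for s in pair] for pair in sk]
    return sk, pk


def message_bits(message: bytes):
    # Hash the message and expand the 32-byte digest into 256 bits (MSB first).
    digest = hashlib.sha256(message).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]


def sign(sk, message: bytes):
    # Reveal one secret per message bit: secret 0 or secret 1 at each position.
    return [sk[i][bit] for i, bit in enumerate(message_bits(message))]


def verify(pk, message: bytes, signature) -> bool:
    # Hash each revealed secret; it must match the published hash for that bit.
    return all(
        hashlib.sha256(signature[i]).digest() == pk[i][bit]
        for i, bit in enumerate(message_bits(message))
    )


sk, pk = keygen()
sig = sign(sk, b"block 1024: tx batch")
print(verify(pk, b"block 1024: tx batch", sig))   # True
print(verify(pk, b"tampered block", sig))         # False
```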


Low-Code Solutions Gain Traction In Banking And Insurance Digital Transformation

“Digital transformation should be focused on quick wins so that organizations can start seeing the ROI much sooner,” he said, noting that digital transformation is not just about adopting new technologies; it is about fundamentally rethinking how businesses operate and deliver value to their customers. One recurring challenge he identified is onboarding in the banking sector. Onboarding times vary from bank to bank, but internal inefficiencies often cause delays, with much of the holdup stemming from internal back-and-forth rather than external factors. To address this, Arun MS advocated a shift toward self-service portals, where customers take control of processes such as document submission. “Engaging customers as stakeholders in the process reduces internal bottlenecks and speeds up the overall timeline for onboarding,” he said. This approach not only enhances operational efficiency but also improves the customer experience, which is essential in an increasingly digital world. However, Arun MS was quick to caution that transferring processes to customers must be done thoughtfully.
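As a loose illustration of that shift, the sketch below models onboarding as a small state machine in which the customer’s own document uploads, rather than an internal queue, move the case forward. The states, required documents, and class names are hypothetical, not any specific bank’s process.

```python
# Illustrative self-service onboarding flow: the customer, not a back-office
# queue, drives document submission, shrinking internal handoffs.
# All states and document types here are hypothetical.
from enum import Enum, auto


class OnboardingState(Enum):
    AWAITING_DOCUMENTS = auto()   # the customer acts here, not the bank
    UNDER_REVIEW = auto()
    APPROVED = auto()


class OnboardingCase:
    REQUIRED = {"proof_of_identity", "proof_of_address"}

    def __init__(self) -> None:
        self.state = OnboardingState.AWAITING_DOCUMENTS
        self.received: set[str] = set()

    def submit_document(self, doc_type: str) -> None:
        """Customer self-service upload; the case advances automatically."""
        self.received.add(doc_type)
        if self.REQUIRED <= self.received:
            self.state = OnboardingState.UNDER_REVIEW  # no internal chasing step

    def approve(self) -> None:
        if self.state is OnboardingState.UNDER_REVIEW:
            self.state = OnboardingState.APPROVED


case = OnboardingCase()
case.submit_document("proof_of_identity")
case.submit_document("proof_of_address")
print(case.state)   # OnboardingState.UNDER_REVIEW, with no internal handoff
```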
