Sundial is now a member of the Open Semantic Interchange (OSI), the open source initiative led by Snowflake to create a universal standard for semantic data. Why this matters: every company defines its metrics differently. "Revenue" may mean one thing to your agent and another to your dashboard. OSI aims to fix that with a vendor-neutral spec that makes these definitions portable and interoperable. For us at Sundial, this is a natural fit. We already build the context layer that makes data meaningful for our customers. Participating in OSI means that context can travel across the ecosystem, not just within our platform. Excited to be working alongside Snowflake and partners across BI, data engineering, AI, and more to make this happen. https://lnkd.in/g5bT2rjH
Sundial’s Post
Coginiti is joining the Open Semantic Interchange (OSI) — an open source initiative led by Snowflake to create a vendor-neutral standard for semantic metadata across the data and AI ecosystem. Why this matters: the industry has a fragmentation problem. Your metric definitions live in one format in your BI tool, another in your notebook, another in your data warehouse, and yet another in your AI pipeline. Every tool speaks its own semantic dialect. The result? Integration friction, inconsistent analytics, and AI that can't trust the definitions it's working with. OSI aims to fix this by defining semantic metadata in a standard, open format that can be shared across tools and platforms. One specification. Every vendor. Consistent metrics from dashboard to ML workflow. Coginiti is bringing real-world practitioner experience to the table — years of deploying semantic layers and governed metrics across heterogeneous environments. We know what breaks when semantics aren't portable, and we know what it takes to make business definitions consistent across 21+ platforms. This is exactly what Semantic Intelligence demands: not just governing your business logic within one tool, but ensuring it translates faithfully across the entire ecosystem. Open standards aren't optional anymore. They're the foundation for trustworthy data and AI at scale. Read the full announcement: https://lnkd.in/ep8qzMXn #OpenSemanticInterchange #SemanticIntelligence #DataGovernance #Snowflake #OpenStandards #AI
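The portability problem OSI targets can be shown with a toy sketch. The schema below is invented for illustration (it is not the actual OSI specification): one vendor-neutral metric definition is rendered into two "dialects" — a SQL snippet for a BI tool and a plain-text context string for an AI agent — so both consumers work from the same source of truth.

```python
# Toy illustration of a vendor-neutral metric definition.
# NOTE: this schema is invented for the example; it is NOT the OSI spec.
metric = {
    "name": "revenue",
    "description": "Gross revenue from completed orders",
    "source_table": "orders",
    "aggregation": "SUM",
    "expression": "order_total",
    "filters": ["status = 'completed'"],
}

def to_sql(m):
    """Render the shared definition as a SQL snippet for a BI tool."""
    where = " AND ".join(m["filters"])
    return (f"SELECT {m['aggregation']}({m['expression']}) AS {m['name']} "
            f"FROM {m['source_table']} WHERE {where}")

def to_prompt_context(m):
    """Render the same definition as grounding context for an AI agent."""
    return (f"Metric '{m['name']}': {m['description']}. "
            f"Computed as {m['aggregation']}({m['expression']}) "
            f"over {m['source_table']} where {' and '.join(m['filters'])}.")

print(to_sql(metric))
print(to_prompt_context(metric))
```

Because both renderings derive from one definition, changing the filter in one place changes it for the dashboard and the agent alike — which is the interoperability OSI is after.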
The AI industry has woken up to the importance of a well-governed semantic layer that provides the right data for downstream use (AI, BI, etc.). The Coginiti Semantic Intelligence Platform meets this need. We also recognize the importance of being able to share this information across an enterprise. As a result, we are excited to join the Open Semantic Interchange (OSI) community. While it is still early, OSI is committed to making semantic data shareable at enterprise scale. #opendatastandards
Most AI teams are solving the wrong problem. They optimize #dashboards while agents are choking on queries. Your database handles user data beautifully, but the moment your AI agents start asking analytical questions, everything slows to a crawl. 🐘 The answer is not building a new database from scratch. It is pairing your stack with ClickHouse ⚡ We just published our full Agentic Data Stack architecture. Three layers and all #opensource: 1️⃣ LibreChat as the conversational interface for your agents 2️⃣ ClickHouse plus MCP as the data layer agents can reason over 3️⃣ Langfuse for tracing, cost monitoring, and LLM observability The best part? Your agents keep talking to your existing stack. The ClickHouse MCP server routes analytical queries under the hood, so your code changes stay minimal 🧩 Two proven patterns depending on your workload: conversational analytics for real-time business questions, and observability pipelines for AI applications generating millions of events per second. Companies like OpenAI, Anthropic, and Character.AI already run this in production 🚀 Great writeup by the ClickHouse team walking through the full architecture and how to get started locally 👇 https://lnkd.in/eNgJjv3D #AgenticAI #DataStack #RealTimeAnalytics #ClickHouse #LLMOps #LLMObservability #Data #MCP #DataEngineering #AI #Database Dasha Charlotte Lamia Muuse Bob Arno Hellmar Robby Marc-Steffen
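The routing idea in the post — point lookups stay on the operational database while analytical queries go to ClickHouse — can be sketched as a small dispatcher. This is only an illustration of the concept; the real ClickHouse MCP server works differently, and every name below (`classify`, `route`, the backend labels) is invented for the example.

```python
# Illustrative sketch of workload-based query routing: keep point lookups
# on the operational store, send aggregations/scans to an analytical store.
# NOT the ClickHouse MCP server's implementation; all names are invented.

ANALYTICAL_KEYWORDS = ("GROUP BY", "COUNT(", "SUM(", "AVG(", "HISTOGRAM")

def classify(sql: str) -> str:
    """Crude heuristic: aggregate constructs imply an analytical workload."""
    upper = sql.upper()
    if any(kw in upper for kw in ANALYTICAL_KEYWORDS):
        return "analytical"
    return "transactional"

def route(sql: str) -> str:
    """Return which backend the query would be dispatched to."""
    return "clickhouse" if classify(sql) == "analytical" else "postgres"

print(route("SELECT * FROM users WHERE id = 42"))                      # point lookup
print(route("SELECT country, COUNT(*) FROM events GROUP BY country"))  # aggregation
```

The point of the pattern is exactly what the post describes: application code keeps issuing queries as before, and the routing layer decides under the hood which engine answers them.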
I love seeing this from Databricks - GA of UC Business Semantics. The future of BI and AI depends on having trusted business logic and metrics defined in a governed, reusable way. It’s also a big reason why Omni and Databricks are such a strong match. Omni gives customers an AI-first analytics experience on top of that governed foundation, helping teams get to answers faster without losing trust or consistency. Shout out to Raja Perumal for helping to make this blog happen! https://lnkd.in/gpAzb6Cf
Announcing General Availability and Open Sourcing of Unity Catalog Business Semantics (databricks.com)
Yesterday I attended Nextdata's Data 3.0 webinar, and one thing that stayed with me was: "AI moves in sub-seconds. Enterprise data moves in quarters." Zhamak Dehghani, with Jörg Schad, Ph.D., made the case for why the modern data stack is fundamentally broken for the AI era. Here are my takeaways — ───────────────────────── 🔴 PROBLEM 1: Your architecture, not your tools Data 2.0 was built for humans. Batch pipelines, manual handoffs, governance bolted on last. 6 months to unlock a new data source. 70% of team effort just keeping the lights on. ✅ SOLUTION: Autonomous Data Products (ADPs) A single living runtime artifact per domain that encapsulates code, data model, semantic model, policies/contracts, and ingestion + self-serving... accessible via APIs. Not another disconnected pipeline hairball. ───────────────────────── 🔴 PROBLEM 2: Bad data becomes operational risk A wrong dashboard misleads a meeting. A flawed AI agent makes thousands of wrong decisions. Agents can't "spot" data mistakes the way humans can. ✅ SOLUTION: In-process governance, not post-facto checks The demo showed this live. A PII policy violation was caught before any data reached consumers. Policies/promises are evaluated during the data product lifecycle. ───────────────────────── 🔴 PROBLEM 3: Your semantic layer is always behind In Data 2.0, semantics are added after the fact — a catalog entry here, a dbt metric there. Agents don't browse catalogs. By the time meaning is annotated, the data has moved on. ✅ SOLUTION: Semantic-first, modality-agnostic serving One semantic model. Serve SQL to analysts, embeddings to RAG pipelines, MCP endpoints to agents. All in sync, all from the same definition. Zhamak's bet: The business semantic will always be there. Bet on the semantic, not the format. ───────────────────────── 🔴 PROBLEM 4: Data Mesh gave us principles, not implementation Domain ownership. Data as a product. Federated computational governance. All right.
But largely silent on the how, and that is why adoption has been so uneven across the industry. ✅ SOLUTION: ADPs are how Data Mesh finally ships Autonomous Data Products are the concrete technical unit that makes Data Mesh principles executable. The pipeline as a mental model disappears. The domain-scoped, self-orchestrating data product as a running application takes its place. ───────────────────────── The containers analogy landed well: every time complexity became unmanageable in computing, the answer was to encapsulate, define clean interfaces, and automate operations. That's the bet here. And your existing stack stays: the ADP sits above it. I would like to see real-world implementation stories before fully buying in. But the framing was sharp and the issues it described were real. #DataProducts #DataMesh #SemanticLayer #ModernDataStack #GenAI #DataStrategy #DataEngineering #DataGovernance
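The "one semantic model, many serving modalities" idea from PROBLEM 3 can be sketched in a few lines. The schema below is hypothetical (not Nextdata's actual ADP format): one definition is rendered both as SQL for an analyst and as an MCP-style tool schema for an agent, so the two modalities can never drift apart.

```python
# One semantic definition, rendered for two different consumers.
# Hypothetical schema for illustration; NOT Nextdata's actual ADP format.
semantic_model = {
    "entity": "customer",
    "metric": "monthly_churn_rate",
    "definition": "churned_customers / active_customers_at_month_start",
    "grain": "month",
}

def serve_sql(model):
    """SQL-flavoured view for analysts."""
    return (f"SELECT month, {model['definition']} AS {model['metric']} "
            f"FROM {model['entity']}_facts GROUP BY month")

def serve_mcp_tool(model):
    """Simplified MCP-style tool schema an agent could call."""
    return {
        "name": f"get_{model['metric']}",
        "description": (f"{model['metric']} = {model['definition']} "
                        f"per {model['grain']}"),
        "parameters": {"month": {"type": "string"}},
    }

print(serve_sql(semantic_model))
print(serve_mcp_tool(semantic_model)["name"])
```

Both outputs are projections of the same object, which is the "all in sync, all from the same definition" claim in concrete form.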
Integrating the specific data and architectural concepts from Localzz.com into this 2,000-domain framework would likely act as the "Operating System" for the entire portfolio. Based on the structure of your assets, adding the information from that central hub transforms the collection from a list of properties into a functioning Neural Grid for local commerce. Here is how that integration changes the potential: 1. Centralized "Command and Control" If the domains provide the real estate, Localzz.com acts as the Headquarters. The "One Profile" Integration: By using Localzz.com as the central repository for "The Local Business Record," you create a single source of truth. A business could update its information once on the hub, and that data would theoretically propagate across the entire 50-state "izze/zz" network and vertical-specific sites like homerepairzz.com. Unified Identity: It serves as the primary authentication point (localbusinesslogin.com), linking the fragmented domains into a cohesive user experience. 2. The Bridge Between "Human" and "Agentic" Search Localzz.com appears to be the interface where the complex "Infrastructure" meets the end-user. Semantic Mapping: While the underlying domains like datanormalizationlayer.com handle the machine-readable side, the information on the hub provides the context and "human" layer that trains the AI. The "Answer Engine" Feed: By aggregating information on a central platform, you create a massive, high-authority dataset that "Answer Engines" can crawl more efficiently than thousands of individual sites. 3. Turning "Names" into "Nodes" Without a central hub, 2,000 domains are just individual sites. With the information and logic from Localzz.com: The Mesh Network: Every domain becomes a "node" in a larger local intelligence mesh. californiaizze.com isn't just a site; it’s a regional node of the Localzz data engine.
Vertical Velocity: You can deploy "campaigns" (referenced in your domains like launchmediacampaign.com) across the entire network instantly. 4. Strategic Valuation Shift Adding a functioning central hub like Localzz.com fundamentally changes the valuation math: From Portfolio to Platform: Investors value domain portfolios on a "per-name" basis. They value platforms on Multiples of Revenue or Data Volume. The "Moat": The information on Localzz.com—the proprietary business data, the verified entities, and the user interactions—creates a competitive moat. Anyone can buy a domain, but not everyone has a pre-built, 50-state-compatible data architecture to plug it into. Summary of Impact Integrating the information from Localzz.com effectively moves the project from the Speculation Phase to the Utility Phase. It provides the "logic" that allows the aiinfrastructurestack.com to actually process and distribute local data. You stop being a "landlord" and start being the System Administrator of a new local internet.
This is a massive leap forward for the lakehouse, moving from raw tables to a governed, shared language for BI and AI. That’s why Anomalo is proud to partner with Databricks to augment this new semantic surface. While you define your core business logic in Unity Catalog, Anomalo works behind the scenes to: ✅ Automatically monitor your metrics ✅ Alert you if "Revenue" or "Churn" KPIs spike ✅ Ensure your business teams make informed decisions Congrats to the Databricks team on this GA milestone! Together, we’re making Data Intelligence a reality. Check out the full announcement. #Databricks #Anomalo #DataQuality #GenerativeAI #DataGovernance
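A metric-spike alert of the kind described can be sketched with a simple z-score against a trailing window. To be clear, this is an illustration of the basic idea only, not Anomalo's actual detection logic; the function name and threshold are invented for the example.

```python
import statistics

# Illustrative spike detection on a KPI series via z-score against the
# trailing window. NOT Anomalo's actual algorithm -- just the basic idea.
def is_spike(history, latest, threshold=3.0):
    """Flag `latest` if it deviates from the trailing window's mean by
    more than `threshold` standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean  # any deviation from a flat series is a spike
    return abs(latest - mean) / stdev > threshold

revenue = [100, 102, 98, 101, 99, 103, 100, 97]
print(is_spike(revenue, 101))   # False: within normal variation
print(is_spike(revenue, 250))   # True: flagged as a spike
```

Production monitoring would layer seasonality, trend, and alert routing on top, but the core question is the same: does today's "Revenue" look like yesterday's distribution?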
Announcing General Availability and Open Sourcing of Unity Catalog Business Semantics (databricks.com)
🚀 New blog post: Architectural Orchestration of Financial Intelligence: Implementing Snowflake Cortex Agents https://lnkd.in/gkCCMWUt Exploring how Snowflake Cortex Agents enable agent-driven financial intelligence—moving from static pipelines to autonomous, multi-step workflows. If you're building with Snowflake or GenAI, this is a glimpse of what’s next. Would love your thoughts 👇 #Snowflake #AI #GenAI #DataEngineering #CortexAI
We’ve just announced GA of our semantic layer in Databricks (Unity Catalog Business Semantics) 🥳 Not because semantic layers are new — but what’s interesting is where it lives. Instead of sitting in BI tools, metrics and definitions are now defined directly in the data layer and can be accessed via SQL, APIs, etc. across tools and use cases. In practice, this means: – Same definitions across BI, apps, and other workloads – Governance, access, and lineage are handled automatically (via Unity Catalog) 🔐 – Less dependency on individual tools On top of that, we’ve open sourced the core component (Metric Views in Apache Spark), so definitions aren’t locked into a single platform 🔓 And along the same lines, we’ve joined the work around a more open standard for semantics (OSI), to avoid creating yet another closed layer in the stack 🤝 👉 https://lnkd.in/eXKzUSvg So this is less about “another semantic layer” — and more about a shift in where business logic lives: closer to the data, and away from the tools. This becomes especially relevant as data is no longer only used in dashboards — but also in apps, automations, and AI 🤖, where consistent definitions quickly become critical.
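The "define once, reuse everywhere" claim can be made concrete with a toy evaluation sketch. The structure below is invented for illustration (it is not the Unity Catalog metric view format): a single governed definition is evaluated through one shared code path, so a dashboard and an agent cannot disagree on the number.

```python
# Illustrative "define once, reuse everywhere": one governed metric
# definition evaluated identically by every consumer.
# Invented structure -- NOT the actual Unity Catalog metric view format.
orders = [
    {"order_total": 120.0, "status": "completed"},
    {"order_total": 80.0,  "status": "completed"},
    {"order_total": 55.0,  "status": "cancelled"},
]

REVENUE = {
    "filter": lambda r: r["status"] == "completed",  # which rows count
    "value": lambda r: r["order_total"],             # what to sum
}

def evaluate(metric, rows):
    """Single evaluation path shared by every consumer."""
    return sum(metric["value"](r) for r in rows if metric["filter"](r))

dashboard_revenue = evaluate(REVENUE, orders)   # BI dashboard
agent_revenue = evaluate(REVENUE, orders)       # AI agent / API caller
print(dashboard_revenue == agent_revenue)       # same definition, same answer
```

The contrast with the status quo is the point: when each BI tool re-implements the filter on its own, "completed" quietly drifts, and the dashboard and the agent stop agreeing.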
Announcing General Availability and Open Sourcing of Unity Catalog Business Semantics (databricks.com)
𝐓𝐡𝐞 𝐬𝐞𝐦𝐚𝐧𝐭𝐢𝐜 𝐥𝐚𝐲𝐞𝐫 𝐢𝐬 𝐦𝐨𝐯𝐢𝐧𝐠 𝐟𝐫𝐨𝐦 𝐁𝐈 𝐭𝐨𝐨𝐥𝐬 𝐢𝐧𝐭𝐨 𝐭𝐡𝐞 𝐝𝐚𝐭𝐚 𝐩𝐥𝐚𝐭𝐟𝐨𝐫𝐦; 𝐛𝐞𝐜𝐨𝐦𝐢𝐧𝐠 𝐭𝐡𝐞 𝐟𝐨𝐮𝐧𝐝𝐚𝐭𝐢𝐨𝐧 𝐟𝐨𝐫 𝐛𝐨𝐭𝐡 𝐚𝐧𝐚𝐥𝐲𝐭𝐢𝐜𝐬 𝐚𝐧𝐝 𝐀𝐈. For years, business logic lived in BI tools like Tableau, Power BI, Looker, and the like, each with its own definitions. Same metric, different answers. Now the direction is clear (as highlighted in the Databricks article below): Bring semantics closer to the data + Define once + Reuse everywhere. AI doesn’t understand tables. It understands meaning. Moving semantics into the data layer is not just a technical shift. It’s an organisational one. You still need: → clear ownership → agreed definitions → discipline in how metrics are created and maintained Otherwise, you’re just centralising confusion instead of solving it. https://lnkd.in/eftcD-WA
Announcing General Availability and Open Sourcing of Unity Catalog Business Semantics (databricks.com)