<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	 xmlns:media="http://search.yahoo.com/mrss/" >

<channel>
	<title>Business Intelligence &#8211; Nearshore Software Development Company &#8211; IT Outsourcing Services</title>
	<atom:link href="https://nearshore-it.eu/tag/business-intelligence/feed/" rel="self" type="application/rss+xml" />
	<link>https://nearshore-it.eu</link>
	<description>We are a Nearshore Software Development Company with 14 years of experience in delivering large-scale IT projects in the areas of PHP, JAVA, .NET, BI and MDM.</description>
	<lastBuildDate>Thu, 02 Apr 2026 11:34:42 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://nearshore-it.eu/wp-content/uploads/2023/01/cropped-inetum-favicon-300x300-1-32x32.png</url>
	<title>Business Intelligence &#8211; Nearshore Software Development Company &#8211; IT Outsourcing Services</title>
	<link>https://nearshore-it.eu</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>SAP Business Data Cloud &#038; Databricks: from data fragmentation to enterprise AI</title>
		<link>https://nearshore-it.eu/articles/sap-business-data-cloud-databricks/</link>
					<comments>https://nearshore-it.eu/articles/sap-business-data-cloud-databricks/#respond</comments>
		
		<dc:creator><![CDATA[Piotr]]></dc:creator>
		<pubDate>Wed, 25 Mar 2026 13:05:45 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Technologies]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<category><![CDATA[Databricks]]></category>
		<category><![CDATA[SAP]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=37894</guid>

					<description><![CDATA[Less than 40% of business leaders report high confidence in their own data. In this article, we break down how SAP Business Data Cloud and Databricks address that gap — with insights from SAP, a 50% cost reduction case study from Rolls-Royce, and a practical guide to getting started.]]></description>
										<content:encoded><![CDATA[
<p>More than half of organizations struggle to keep data accurate and consistent. Less than 40% of leaders report high confidence in their own numbers. Yet the pressure to deliver AI-driven insights &#8211; in real time &#8211; has never been higher.</p>



<p>That was the starting point for Inetum&#8217;s webinar <em><a href="https://www.engage.inetum.com/sap-bdc-databricks-webinar-on-demand/" target="_blank" rel="noopener">SAP Business Data Cloud &amp; Databricks: Unlock AI &amp; Data potential</a></em>, which brought together practitioners from Inetum and special guests from SAP and Rolls-Royce. The session featured Jan Tretina (Ecosystem Development Manager, SAP), Sebastian Stefanowski (Databricks Practice Leader, Inetum), Raul Muñoz-Gutierrez (SAP Analytics Business Director, Inetum), and Andrew Lager (Program Manager and Digital Delivery Manager, Civil Digital and IT, Rolls-Royce), hosted by Oleh Hudym (SAP Growth Manager, Inetum).</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#less-than-40%-of-leaders-trust-their-own-data-and-that-gap-stalls-ai">1.  Less than 40% of leaders trust their own data &#8211; and that gap stalls AI</a></li>
                    <li><a href="#what-sap-business-data-cloud-is-and-how-it-closes-the-trust-gap">2.  What SAP Business Data Cloud is &#8211; and how it closes the trust gap</a></li>
                    <li><a href="#migration-bdc-works-with-what-organizations-already-have">3.  Migration: BDC works with what organizations already have</a></li>
                    <li><a href="#data-products-eliminate-the-80%-of-a-project-that-adds-no-value">4.  Data products eliminate the 80% of a project that adds no value</a></li>
                    <li><a href="#sap-databricks-and-standalone-databricks-the-same-engine-different-purpose">5.  SAP Databricks and standalone Databricks: the same engine, different purpose</a></li>
                    <li><a href="#six-years-with-databricks-at-rolls-royce:-50%-cost-reduction-and-ai-for-every-analyst">6.  Six years with Databricks at Rolls-Royce: 50% cost reduction and AI for every analyst</a></li>
                    <li><a href="#what-ai-on-sap-data-actually-looks-like">7.  What AI on SAP data actually looks like</a></li>
                    <li><a href="#flexible-consumption-licensing-that-moves-with-the-organization">8.  Flexible consumption: licensing that moves with the organization</a></li>
                    <li><a href="#what-is-coming-in-sap-business-data-cloud-in-2026">9.  What is coming in SAP Business Data Cloud in 2026</a></li>
                    <li><a href="#how-cfos-should-measure-roi-on-sap-business-data-cloud">10.  How CFOs should measure ROI on SAP Business Data Cloud</a></li>
                    <li><a href="#how-to-start-without-committing-to-a-licence-first">11.  How to start &#8211; without committing to a licence first</a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="less-than-40%-of-leaders-trust-their-own-data-and-that-gap-stalls-ai">Less than 40% of leaders trust their own data &#8211; and that gap stalls AI</h2>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>&#8220;Less than 40% of leaders report high confidence in their data,&#8221;</em> Jan Tretina said at the opening of the session. <em>&#8220;That gap directly impacts both decision speed and innovation.&#8221;</em></p>
</blockquote>



<p>Three issues sit behind this figure. First, data quality: over half of organizations struggle to keep data accurate and consistent across systems. Second, misalignment between IT and business &#8211; finance teams need insights quickly, but IT landscapes can&#8217;t always deliver at that pace. Third, fragmentation: with data spread across multiple systems, bringing it together in real time remains a major pain point.</p>



<p>The business consequence is concrete. Data-oriented organizations &#8211; those that have solved the trust problem &#8211; are, according to SAP&#8217;s analysis,<strong> four times more likely to succeed.</strong></p>



<h2 class="wp-block-heading" id="what-sap-business-data-cloud-is-and-how-it-closes-the-trust-gap">What SAP Business Data Cloud is &#8211; and how it closes the trust gap</h2>



<p>Jan Tretina described SAP Business Data Cloud through a flywheel: AI is only as strong as the data behind it, data is only valuable when it is trusted and accessible, and both require a resilient platform underneath.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>&#8220;SAP Business Data Cloud is our unified data foundation,&#8221;</em> he said. <em>&#8220;It brings clean, connected, and trusted business data together &#8211; unifying data across SAP and non-SAP systems and making that data immediately usable for AI, analytics, and planning.&#8221;</em></p>
</blockquote>



<p>BDC consolidates SAP data services &#8211; SAP BW, SAP Datasphere, SAP Analytics Cloud, and extension partners including Databricks and Snowflake &#8211; under a single platform. The operational impact Tretina highlighted: instead of maintaining thousands of pipelines, custom integrations, and shadow data, BDC provides one consistent data foundation across all use cases. It embraces non-SAP data, open standards, and a broad partner ecosystem, letting organizations build multi-vendor landscapes without losing control of their data. This architecture makes BDC relevant across sectors &#8211; from retailers to manufacturing to financial services and banking.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img fetchpriority="high" decoding="async" width="1024" height="1024" src="https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1.jpg" alt="concept flywheel bdc c 1" class="wp-image-37917" title="SAP Business Data Cloud &amp; Databricks: from data fragmentation to enterprise AI 1" srcset="https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1.jpg 1024w, https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1-300x300.jpg 300w, https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1-150x150.jpg 150w, https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1-768x768.jpg 768w, https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1-495x495.jpg 495w, https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1-395x395.jpg 395w, https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1-675x675.jpg 675w, https://nearshore-it.eu/wp-content/uploads/2026/03/concept-flywheel-bdc-_c-1-900x900.jpg 900w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
</div>


<div style="height:39px" aria-hidden="true" class="wp-block-spacer"></div>



<h2 class="wp-block-heading" id="migration-bdc-works-with-what-organizations-already-have">Migration: BDC works with what organizations already have</h2>



<p>A recurring concern when evaluating any new platform is the cost of transition &#8211; years of investment in SAP BW, Datasphere, or Analytics Cloud environments that can&#8217;t simply be discarded.</p>



<p>Raul Muñoz-Gutierrez addressed this directly. BDC includes all SAP data services within a single platform and is designed to absorb existing environments, not replace them. For SAP BW, BDC offers a &#8220;lift and shift&#8221; migration path &#8211; the existing environment moves in with data and connections preserved. </p>



<p>For SAP Datasphere or SAP Analytics Cloud, a &#8220;rewiring&#8221; process brings those solutions under the BDC umbrella without rebuilding. <em>&#8220;All these tasks don&#8217;t require any work from the customers,&#8221;</em> Muñoz-Gutierrez confirmed. <em>&#8220;They can be carried out directly by SAP.&#8221;</em></p>



<h2 class="wp-block-heading" id="data-products-eliminate-the-80%-of-a-project-that-adds-no-value">Data products eliminate the 80% of a project that adds no value</h2>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>&#8220;Do you know the effort and time required in a data project to perform data extraction, loading and transformation, as well as reconciling all the information?&#8221;</em> Raul Muñoz-Gutierrez asked during the session. <em>&#8220;All that work, that normally is the 80% of the project, is partially eliminated through a SAP data product.&#8221;</em></p>
</blockquote>



<p>SAP data products contain the main data from key functional areas &#8211; financial, HR, supply chain &#8211; and they inherit all the data semantics from SAP. This makes them usable not just for reporting, but for AI and machine learning development from day one.</p>


<div class="wp-block-image">
<figure class="aligncenter size-large"><img decoding="async" width="1296" height="641" src="https://nearshore-it.eu/wp-content/uploads/2026/03/data-products-8020-v1-1296x641.jpg" alt="data products 8020 v1" class="wp-image-37927" title="SAP Business Data Cloud &amp; Databricks: from data fragmentation to enterprise AI 2" srcset="https://nearshore-it.eu/wp-content/uploads/2026/03/data-products-8020-v1-1296x641.jpg 1296w, https://nearshore-it.eu/wp-content/uploads/2026/03/data-products-8020-v1-300x148.jpg 300w, https://nearshore-it.eu/wp-content/uploads/2026/03/data-products-8020-v1-768x380.jpg 768w, https://nearshore-it.eu/wp-content/uploads/2026/03/data-products-8020-v1-495x245.jpg 495w, https://nearshore-it.eu/wp-content/uploads/2026/03/data-products-8020-v1-1320x653.jpg 1320w, https://nearshore-it.eu/wp-content/uploads/2026/03/data-products-8020-v1.jpg 1456w" sizes="(max-width: 1296px) 100vw, 1296px" /></figure>
</div>


<div style="height:33px" aria-hidden="true" class="wp-block-spacer"></div>



<p>For CFOs specifically, the Financial Intelligence Package activates intelligent applications on SAP Analytics Cloud while simultaneously triggering data loading from SAP S/4HANA into BDC data products. <em>&#8220;Regularly every 30 minutes, the data is going to travel from our SAP S/4 system to our BDC,&#8221;</em> Muñoz-Gutierrez explained. Treasury management, financial planning, and forecasting become available and AI-ready from the moment of activation &#8211; with the option to build custom AI and machine learning on top.</p>



<h2 class="wp-block-heading" id="sap-databricks-and-standalone-databricks-the-same-engine-different-purpose">SAP Databricks and standalone Databricks: the same engine, different purpose</h2>



<p>If Databricks is already in the organization&#8217;s stack &#8211; or under evaluation &#8211; a natural question arises: how does SAP Databricks within BDC relate to the standard Databricks platform?</p>



<p>Sebastian Stefanowski, whose team works with both, explained the distinction. SAP Databricks is a modified release of the Databricks platform, tightly integrated with BDC &#8211; authentication, authorization, and billing are all managed through SAP, using SAP compute units. The most important aspect of the integration, in Stefanowski&#8217;s words, is a dedicated connector enabling zero-copy data sharing between SAP data products and the Databricks catalog: SAP financial data becomes directly queryable in Databricks without replication or transformation overhead.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>&#8220;If SAP data is at the heart of your business, and financial data is really central to what you do, then SAP Databricks is definitely worth serious consideration,&#8221;</em> Stefanowski said. <em>&#8220;It would enable most of the useful features of Databricks with, generally speaking, no technological barrier.&#8221;</em></p>
</blockquote>



<p>The trade-off is feature breadth. Organizations with IoT streaming requirements, multi-stage declarative pipeline orchestration (DLT), or a need to manage their own compute clusters will find standalone Databricks richer &#8211; with dedicated connectors for external cloud services and platforms like Salesforce. In SAP Databricks, those broader integrations run through SAP. Stefanowski&#8217;s summary was clear: for general data integration scenarios, standalone Databricks; for organizations where SAP financial data is the core, SAP Databricks removes the barriers.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>&#8220;What seemed to be impossible a few years back in terms of AI capabilities is now possible with this amazing partnership between SAP and Databricks,&#8221;</em> Oleh Hudym noted during the session.</p>
</blockquote>



<h2 class="wp-block-heading" id="six-years-with-databricks-at-rolls-royce:-50%-cost-reduction-and-ai-for-every-analyst">Six years with Databricks at Rolls-Royce: 50% cost reduction and AI for every analyst</h2>



<p>Andrew Lager has been on the Databricks journey at Rolls-Royce for over six years, including as an early adopter of many of its latest technologies &#8211; and he came to the webinar with numbers.</p>



<p><em>&#8220;We&#8217;ve reduced costs upwards of 50% compared to our previous solution,&#8221;</em> he said, referring to the migration from a legacy warehouse to Databricks Lakehouse for engine health monitoring data. Three factors drove that reduction: cheaper compute via Spark, schema evolution that prevents jobs from breaking when table structures change during migrations, and Unity Catalog &#8211; which centralizes data access across sources so Lager can produce management information for internal stakeholders without involving additional teams.</p>



<!-- CTA: Webinar on demand — mid-article -->
<div style="border-left:4px solid #00978a;background:#f7f8fc;padding:20px 24px;margin:32px 0;border-radius:0 4px 4px 0;">
  <p style="margin:0 0 6px;font-size:11px;letter-spacing:1.5px;text-transform:uppercase;color:#00978a;font-weight:700;">Webinar on demand</p>
  <p style="margin:0 0 10px;font-size:18px;font-weight:700;color:#0d1b2a;line-height:1.3;">The full Rolls-Royce story &#8211; and how to replicate it</p>
  <p style="margin:0 0 16px;font-size:15px;color:#444c5e;line-height:1.6;">Andrew Lager walks through six years of Databricks at Rolls-Royce &#8211; the architectural decisions, the AI features that changed daily operations, and the live Q&amp;A with SAP, Databricks, and Inetum practitioners.</p>
  <a href="https://www.engage.inetum.com/sap-bdc-databricks-webinar-on-demand/" style="display:inline-block;background:#00978a;color:#ffffff;padding:10px 22px;border-radius:4px;font-weight:600;font-size:14px;text-decoration:none;" target="_blank" rel="noopener">Watch the recording →</a>
</div>




<p>On the AI side, his example was equally direct. <em>&#8220;Those activities that would take me days can take me five minutes now,&#8221;</em> he said, describing Databricks AI Genie &#8211; a feature that lets non-technical users query datasets in natural language instead of SQL.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>&#8220;The biggest benefit, not just for me but for Rolls-Royce in general, has been opening up technical solutions to less technical people.&#8221;</em></p>
</blockquote>



<h2 class="wp-block-heading" id="what-ai-on-sap-data-actually-looks-like">What AI on SAP data actually looks like</h2>



<p>Sebastian Stefanowski described Databricks as a complete environment for AI and machine learning, where SAP BDC data products are exposed as tables that plug directly into data pipelines. He grouped the platform&#8217;s AI and ML capabilities into several practical areas:</p>



<ul class="wp-block-list">
<li><strong>AI Playground</strong> &#8211; run any off-the-shelf or externally sourced model on a serverless runtime; create custom prompts and test hypotheses before committing to a build</li>



<li><strong>Model serving endpoints</strong> &#8211; host and serve custom models at scale</li>



<li><strong>AI agent framework</strong> &#8211; build agentic solutions quickly, with a built-in vector search database for RAG implementations</li>



<li><strong>MLflow</strong> &#8211; manage and monitor the full training lifecycle; compare experiments and select the best-performing model</li>



<li><strong>AutoML</strong> &#8211; Databricks runs parallel experiments across multiple model architectures on your data automatically, then selects the model that best fits your test results</li>



<li><strong>AI functions in SQL</strong> &#8211; call AI models directly from a SQL SELECT statement and store predictions as part of the query output</li>



<li><strong>AI Gateway</strong> &#8211; govern model usage centrally: block unauthorized access, filter sensitive queries, and maintain full oversight of what models run and on what data</li>
</ul>
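<p>To make the &#8220;AI functions in SQL&#8221; idea concrete, here is a minimal stand-in sketch &#8211; not Databricks&#8217; actual runtime or its <code>ai_query</code> function, but the same pattern illustrated with Python&#8217;s built-in <code>sqlite3</code>: a model (here a toy sentiment scorer) is registered as a SQL function and called per row inside a SELECT, with its prediction stored as part of the query output.</p>

```python
import sqlite3

# Stand-in "model": a toy keyword-based sentiment scorer. In Databricks this
# role is played by a hosted model invoked through an AI function; here a
# registered Python UDF illustrates the same call-from-SELECT pattern.
def toy_sentiment(text: str) -> str:
    negative = {"delay", "broken", "refund"}
    return "negative" if any(w in text.lower() for w in negative) else "positive"

conn = sqlite3.connect(":memory:")
conn.create_function("toy_sentiment", 1, toy_sentiment)
conn.executescript("""
    CREATE TABLE feedback (id INTEGER, comment TEXT);
    INSERT INTO feedback VALUES
        (1, 'Great service, fast delivery'),
        (2, 'Still waiting for my refund');
""")

# The model is invoked inline, per row, and its prediction travels with
# the rest of the query output -- no separate scoring pipeline needed.
rows = conn.execute(
    "SELECT id, comment, toy_sentiment(comment) AS sentiment FROM feedback"
).fetchall()
for row in rows:
    print(row)
```

<p>The appeal of the pattern is that analysts who already live in SQL get model predictions without leaving their query &#8211; which is exactly the barrier-lowering theme Stefanowski described.</p>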



<p>He then connected those capabilities to the most common use cases for SAP data: cash flow forecasting, price forecasting, and stock level optimization &#8211; blending SAP financial and operational data with seasonality data, interest rate history, and interest rate predictions to anticipate future costs, prices, and inventory needs. Databricks AutoML fits naturally here, running statistical algorithms in parallel and surfacing the model that best fits historical test data.</p>
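<p>The select-best-model idea behind AutoML can be sketched in a few lines: fit several candidate forecasters on a training window, score each on held-out history, and keep the winner. This is a deliberately toy version (made-up series, three trivial models) of what Databricks AutoML does at scale across real model families.</p>

```python
# Toy monthly series (hypothetical demand figures), split into train/holdout.
series = [100, 104, 108, 113, 117, 122, 126, 131]
train, test = series[:6], series[6:]

def naive(train, horizon):
    # Repeat the last observed value.
    return [train[-1]] * horizon

def mean(train, horizon):
    # Forecast the historical average.
    return [sum(train) / len(train)] * horizon

def linear_trend(train, horizon):
    # Ordinary least squares on (index, value) pairs, extrapolated forward.
    n = len(train)
    x_bar, y_bar = (n - 1) / 2, sum(train) / n
    slope = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(train)) / \
            sum((x - x_bar) ** 2 for x in range(n))
    intercept = y_bar - slope * x_bar
    return [intercept + slope * (n + h) for h in range(horizon)]

def mae(forecast, actual):
    # Mean absolute error on the holdout window.
    return sum(abs(f - a) for f, a in zip(forecast, actual)) / len(actual)

candidates = {"naive": naive, "mean": mean, "linear_trend": linear_trend}
scores = {name: mae(fn(train, len(test)), test) for name, fn in candidates.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best], 2))
```

<p>On this steadily rising series the trend model wins, as expected; the point is the mechanism &#8211; parallel candidate fitting plus holdout scoring &#8211; not the specific models.</p>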



<p>Beyond forecasting, LLM integration opens a second category: automatically generating product descriptions from structured SAP product data; running sentiment analysis on customer feedback to understand the reasoning behind customer behavior; and identifying the most promising customers to approach based on that analysis. </p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>&#8220;All these features can be really, really nicely integrated with SAP and blended with SAP data,&#8221;</em> Stefanowski said. <em>&#8220;I think that&#8217;s something which will bring SAP customers to another level.&#8221;</em></p>
</blockquote>


<div class="wp-block-image">
<figure class="aligncenter size-large"><img decoding="async" width="1296" height="707" src="https://nearshore-it.eu/wp-content/uploads/2026/03/chapter-ai-neural-v1_c-1296x707.jpg" alt="chapter ai neural v1 c" class="wp-image-37930" title="SAP Business Data Cloud &amp; Databricks: from data fragmentation to enterprise AI 3" srcset="https://nearshore-it.eu/wp-content/uploads/2026/03/chapter-ai-neural-v1_c-1296x707.jpg 1296w, https://nearshore-it.eu/wp-content/uploads/2026/03/chapter-ai-neural-v1_c-300x164.jpg 300w, https://nearshore-it.eu/wp-content/uploads/2026/03/chapter-ai-neural-v1_c-768x419.jpg 768w, https://nearshore-it.eu/wp-content/uploads/2026/03/chapter-ai-neural-v1_c-495x270.jpg 495w, https://nearshore-it.eu/wp-content/uploads/2026/03/chapter-ai-neural-v1_c-1320x720.jpg 1320w, https://nearshore-it.eu/wp-content/uploads/2026/03/chapter-ai-neural-v1_c.jpg 1408w" sizes="(max-width: 1296px) 100vw, 1296px" /></figure>
</div>


<div style="height:28px" aria-hidden="true" class="wp-block-spacer"></div>



<h2 class="wp-block-heading" id="flexible-consumption-licensing-that-moves-with-the-organization">Flexible consumption: licensing that moves with the organization</h2>



<p>Raul Muñoz-Gutierrez described the BDC commercial model as one of its differentiators. <em>&#8220;SAP Business Data Cloud offers a unit subscription model that provides customers with flexible subscription pricing,&#8221;</em> he said. <em>&#8220;This allows them to subscribe to the services they currently need and easily modify them in the future.&#8221;</em></p>



<p>The example he gave: an organization starts with SAP BW integrated into BDC. Once that environment has been migrated to SAP Datasphere, the BW capacity is removed and reallocated &#8211; to SAP Databricks or SAP Snowflake, for instance. A separate data services licence layer complements the platform subscriptions, enabling activation of analytical applications with near real-time data, development of custom data products, and external data sharing via BDC Connect &#8211; the service that exposes SAP data to Databricks, Snowflake, and Microsoft Fabric through Delta Sharing.</p>



<h2 class="wp-block-heading" id="what-is-coming-in-sap-business-data-cloud-in-2026">What is coming in SAP Business Data Cloud in 2026</h2>



<p>Jan Tretina outlined SAP&#8217;s roadmap across three areas.</p>



<p><strong>First, deeper openness and connectivity: </strong>BDC Connect is expanding to Databricks, Snowflake, Google Cloud, Microsoft Fabric, and AWS. Bi-directional data sharing with SAP HANA Cloud is also in development &#8211; data flowing both ways, removing silos at the source rather than managing them downstream.</p>



<p><strong>Second, data products:</strong> SAP is expanding coverage across more lines of business and releasing Data Product Studio, which will make modeling SAP and non-SAP data into governed, shareable data products <em>&#8220;much easier than ever before&#8221;</em> &#8211; turning data into <em>&#8220;a true asset others can consume and trust,&#8221;</em> in Tretina&#8217;s words.</p>



<p><strong>Third, AI-native capabilities built directly into the data layer:</strong> a new AI Hub with new models, enhancements to the SAP HANA Cloud Knowledge Graph, and a fully agentic multi-modal database &#8211; including memory for agents. <em>&#8220;This is the database AI was really looking for,&#8221;</em> Tretina said. </p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>&#8220;We are building the data foundation every enterprise will need to power next-generation AI. 2026 will be a breakthrough year for BDC.&#8221;</em></p>
</blockquote>



<h2 class="wp-block-heading" id="how-cfos-should-measure-roi-on-sap-business-data-cloud">How CFOs should measure ROI on SAP Business Data Cloud</h2>



<p>In the Q&amp;A, Jan Tretina identified four areas where BDC generates measurable returns for finance leaders.</p>



<p><strong>Efficiency and proactivity.</strong> Fewer manual reconciliations, reduced data preparation effort, faster close-to-forecast cycles.</p>



<p><strong>Data trust.</strong> Higher data quality and fewer errors &#8211; directly addressing the confidence gap he described at the start of the session.</p>



<p><strong>Agility.</strong> Faster time to insight and quicker scenario iterations.</p>



<p><strong>Business impact.</strong> Improved forecast accuracy and tangible cost savings from AI-driven recommendations.</p>



<p><em>&#8220;CFOs should look at gains in proactivity, trust, and measurable financial outcomes,&#8221;</em> Tretina said. <em>&#8220;With SAP BDC, the ROI is visible both in operational efficiency and in the quality of decisions which can power the business.&#8221;</em></p>



<h2 class="wp-block-heading" id="how-to-start-without-committing-to-a-licence-first">How to start &#8211; without committing to a licence first</h2>



<p>For organizations that want to evaluate BDC before committing to a full rollout, Raul Muñoz-Gutierrez described a concrete entry point. Inetum has three years of experience with SAP Datasphere and related environments, and operates its own SAP Business Data Cloud environment &#8211; available to run proof-of-concept scenarios with customers in a live setting, testing whether a target architecture fits their real needs before any licence investment.</p>



<p><em>&#8220;You don&#8217;t have to invest in a licence right now,&#8221;</em> Muñoz-Gutierrez said. <em>&#8220;You have to know what you want, and we are the best company to accompany you to this final scenario &#8211; because we have SAP Data Specialists and also global data expertise from non-SAP solutions.&#8221;</em></p>



<p>On the Databricks side, Sebastian Stefanowski was unambiguous: <em>&#8220;Databricks is for everyone.&#8221;</em> The platform runs on a pay-per-use model &#8211; if workloads run once a day for an hour, the cost reflects that hour. If the platform is idle, the cost goes to zero. <em>&#8220;The openness and scalability of this platform &#8211; which actually scales down to zero &#8211; that is what should encourage any company who wants to start cloud data processing,&#8221;</em> he said.</p>



<p>The message across the webinar was consistent: <strong>before AI delivers value at scale, organizations need a data foundation that is clean, connected, and trusted.</strong> SAP Business Data Cloud is built to be that foundation. Databricks extends what can be done on top of it. And when the two are combined with the right approach, the gap between fragmented enterprise data and working AI closes faster than most organizations expect.</p>



<hr>

<p><strong>The full session is available on demand.</strong></p>
<p>SAP, Databricks, Rolls-Royce, and Inetum practitioners covered migration strategy, data product architecture, and AI implementation &#8211; including the unedited live Q&amp;A. <a href="https://www.engage.inetum.com/sap-bdc-databricks-webinar-on-demand/" style="color:#00978a;font-weight:600;" target="_blank" rel="noopener">Watch the webinar on demand →</a></p>

<p>To run a proof of concept in Inetum&#8217;s own SAP Business Data Cloud environment &#8211; without upfront licence commitment &#8211; <a href="https://www.engage.inetum.com/data-and-ai-contact/" target="_blank" rel="noopener">talk to our team →</a></p>

]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/articles/sap-business-data-cloud-databricks/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Self-service BI: What It Is, Applications, Benefits, Risks, and Future with AI</title>
		<link>https://nearshore-it.eu/articles/self-service-bi/</link>
					<comments>https://nearshore-it.eu/articles/self-service-bi/#respond</comments>
		
		<dc:creator><![CDATA[Marcin Prys]]></dc:creator>
		<pubDate>Wed, 21 Aug 2024 07:53:13 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Technologies]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<category><![CDATA[Data]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=28589</guid>

					<description><![CDATA[Read our article to find out what makes it different from the traditional approach, how to implement SSBI and what are the possible software solutions for your business.]]></description>
										<content:encoded><![CDATA[
<h3 class="wp-block-heading">Introduction to Self-Service Business Intelligence</h3>



<p>Self-Service Business Intelligence (SSBI) is a modern approach that allows end-users to analyze data and generate reports without the need to involve IT specialists or data analysts. It is designed for business team members who do not have advanced technical skills but need access to data to make informed business decisions. Read our article to find out what makes it different from the traditional approach, how to implement SSBI and what are the possible software solutions for your business.</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#Traditional-BI-and-self-service-BI---what-is-the-difference?">1.  Traditional BI and self-service BI &#8211; what is the difference?</a></li>
                    <li><a href="#Benefits-of-Self-Service-BI">2.  Benefits of Self-Service BI</a></li>
                    <li><a href="#Risks-of-Self-Service-Business-Intelligence">3.  Risks of Self-Service Business Intelligence</a></li>
                    <li><a href="#Implementation-of-Self-Service-Business-Intelligence-and-BI-best-practices">4.  Implementation of Self-Service Business Intelligence and BI best practices</a></li>
                    <li><a href="#Examples-of-Self-Service-BI-tools">5.  Examples of Self-Service BI tools </a></li>
                    <li><a href="#The-future-of-AI-powered-BI-and-self-service-BI">6.  The future of AI-powered BI and self-service BI</a></li>
                    <li><a href="#BI-strategy-for-enhanced-analytics">7.  BI strategy for enhanced analytics</a></li>
                    <li><a href="#Summary">8.  Summary </a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="Traditional-BI-and-self-service-BI---what-is-the-difference?">Traditional BI and self-service BI &#8211; what is the difference?</h2>



<p>The traditional approach to reporting means that users order reports that the IT department then creates. This ties up IT resources and is time-consuming, especially when reports are needed right away.</p>



<p>Modern BI solutions allow you to gain insights from a variety of data sources, and access data for self-service analytics. Using self-service is a convenient way to ease the burden on business analyst departments. However &#8211; I advise against being over-enthusiastic. It&#8217;s worth bearing in mind that a self-service BI tool doesn&#8217;t do everything by itself. To prepare more advanced analyses and reports, you need to involve BI specialists or get training, for example, under the guidance of a solution provider.&nbsp;</p>



<h3 class="wp-block-heading">What is Self-Service BI?</h3>



<p>Self-Service Business Intelligence is a set of tools and technologies that enable business users to independently collect, process, and analyze data. With SSBI, one can create their own reports, dashboards, and data visualizations, which speeds up the decision-making process. These tools are typically user-friendly and offer intuitive interfaces, minimizing the need for technical support.&nbsp;</p>



<p><strong>Also read: </strong></p>



<ul class="wp-block-list">
<li><a href="https://nearshore-it.eu/articles/what-is-data-quality/">Data Quality Importance in Data Governance</a></li>



<li><a href="https://nearshore-it.eu/articles/data-driven-decision-making/" data-type="link" data-id="https://nearshore-it.eu/articles/data-driven-decision-making/">Data-Driven Decision Making</a></li>



<li><a href="https://nearshore-it.eu/articles/data-mesh-vs-data-lake/">Data Mesh vs Data Lake</a></li>
</ul>



<h3 class="wp-block-heading">Applications of Self-Service Business Intelligence</h3>



<p>SSBI is used in many fields, such as:</p>



<ul class="wp-block-list">
<li><strong>Marketing:</strong> Analyzing marketing campaigns, customer segmentation, monitoring ROI.</li>



<li><strong>Finance:</strong> Cost analysis, revenue forecasting, monitoring financial performance.</li>



<li><strong>Sales:</strong> Monitoring sales results, analyzing customer behaviors, predicting market trends.</li>



<li><strong>Operations:</strong> Optimizing processes, monitoring operational performance, analyzing supply chains.</li>



<li><strong>HR:</strong> Analyzing personnel data, monitoring employment metrics, assessing employee performance.</li>
</ul>



<h2 class="wp-block-heading" id="Benefits-of-Self-Service-BI">Benefits of Self-Service BI</h2>



<p>SSBI offers many benefits, including:</p>



<ul class="wp-block-list">
<li><strong>Faster decision-making:</strong> Users have immediate access to the necessary data, allowing for quicker decisions.</li>



<li><strong>Cost reduction:</strong> Less involvement of IT teams in creating reports and analyzing data.</li>



<li><strong>Greater flexibility:</strong> Users can customize reports and analyses to meet their individual needs.</li>



<li><strong>Improved data quality:</strong> Users can continuously verify and correct data, which leads to higher-quality information.</li>



<li><strong>Increased innovation:</strong> Access to SSBI tools stimulates creativity and innovation in data analysis.</li>
</ul>



<h2 class="wp-block-heading" id="Risks-of-Self-Service-Business-Intelligence">Risks of Self-Service Business Intelligence</h2>



<p>Despite many benefits, SSBI also involves certain risks:</p>



<ul class="wp-block-list">
<li><strong>Data security:</strong> Lack of proper security measures can lead to data leaks or unauthorized access.</li>



<li><strong>Data quality:</strong> Users may not have the necessary skills to correctly interpret data, leading to erroneous conclusions.</li>



<li><strong>Improper use of tools:</strong> Without proper training, users may use SSBI tools incorrectly.</li>



<li><strong>Information overload:</strong> Excessive data can lead to difficulties in analysis and interpretation.</li>
</ul>



<h2 class="wp-block-heading" id="Implementation-of-Self-Service-Business-Intelligence-and-BI-best-practices">Implementation of Self-Service Business Intelligence and BI best practices</h2>



<p>The implementation process of SSBI requires a thoughtful approach, encompassing several key stages:</p>



<ol class="wp-block-list">
<li><strong>Data preparation:</strong> Identifying and preparing data that will be easy to analyze is crucial. Data must be accurate, consistent, and up-to-date. The best solution is a central data repository that users can access. Popular tools like Power BI can use this prepared data, facilitating analysis.</li>



<li><strong>Selection of SSBI tools:</strong> Many SSBI tools are available on the market, such as Power BI, Tableau, or Qlik. It is important to choose a tool that best meets the organization&#8217;s needs and is easy for end-users to use.</li>



<li><strong>User training:</strong> To effectively use SSBI tools, users need to be trained. Training should cover the basics of using tools, creating reports and dashboards, and data interpretation. This way, they will be able to prepare reports and analyses themselves without waiting for further support.</li>



<li><strong>Monitoring and support:</strong> After implementing the SSBI tool, it is important to monitor its use and provide support to users. Regular data updates, training sessions, and technical support help maintain the high quality and efficiency of SSBI tools.</li>
</ol>
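<p>The &#8220;data preparation&#8221; stage above can be illustrated with a small sketch. This is a hypothetical example, not part of any BI product: before records are exposed to self-service users, they are checked for completeness, freshness, and duplicates. The field names and thresholds are invented for illustration.</p>

```python
from datetime import date

# Hypothetical sketch of the "data preparation" stage: before exposing
# records to self-service users, verify they are complete, deduplicated,
# and reasonably fresh. Field names are illustrative, not a real schema.
REQUIRED_FIELDS = {"id", "region", "revenue", "updated"}

def prepare(records, today, max_age_days=30):
    """Partition raw records into (clean, rejected)."""
    seen_ids = set()
    clean, rejected = [], []
    for rec in records:
        complete = REQUIRED_FIELDS <= rec.keys()
        fresh = complete and (today - rec["updated"]).days <= max_age_days
        duplicate = complete and rec["id"] in seen_ids
        if complete and fresh and not duplicate:
            seen_ids.add(rec["id"])
            clean.append(rec)
        else:
            rejected.append(rec)
    return clean, rejected

raw = [
    {"id": 1, "region": "EU", "revenue": 120.0, "updated": date(2024, 6, 1)},
    {"id": 1, "region": "EU", "revenue": 120.0, "updated": date(2024, 6, 1)},  # duplicate
    {"id": 2, "region": "US", "revenue": 80.0},                                # incomplete
]
clean, rejected = prepare(raw, today=date(2024, 6, 10))
print(len(clean), len(rejected))  # → 1 2
```

<p>In practice this gatekeeping would live in the central data repository or ETL layer, so every self-service tool downstream sees the same cleaned data.</p>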


</style><div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2024/06/BigCTA_MarekCzachorowski.jpg" alt="BigCTA MarekCzachorowski" title="Self-service BI: What it is? Applications, Benefits, Risks, and Future with AI 4"></div><div class="tile-content"><p class="entry-title client-name promotion-box__headline2">Elevate Your Data Strategy</p>
<p class="promotion-box__description2">Our customized Data solutions align with your business objectives. Consult with <strong>Marek Czachorowski</strong>, Head of Data and AI Solutions, for expert guidance.</p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Schedule a meeting</a></div></div></div></div>



<h2 class="wp-block-heading" id="Examples-of-Self-Service-BI-tools">Examples of Self-Service BI tools</h2>



<p>Traditional BI tools are often complex software that requires specialized knowledge to operate, while self-service BI platforms are designed to be easy to use for non-technical users. Self-service BI tools let users access and analyze business data without engaging BI specialists. Below we present a list of popular self-service tools for data analytics &#8211; get to know these <a href="https://nearshore-it.eu/articles/5-popular-business-intelligence-tools-2/">BI tools&#8217; capabilities </a>in detail to choose the best self-service tool.</p>



<h3 class="wp-block-heading">Tableau</h3>



<p>Tableau is a self-service tool with drag-and-drop functionality that enables users to easily create and manage visualizations and dashboards. The system is popular thanks to its simple, intuitive interface.</p>



<h3 class="wp-block-heading">Microsoft Power BI</h3>



<p>Microsoft Power BI is integrated with the Azure cloud platform and is approachable even for non-technical users. It provides modern reports and dashboards available on all devices. Data for reports is collected from various sources into multidimensional data models, making it possible to conduct analyses in real time.&nbsp;</p>



<h3 class="wp-block-heading">Qlik&nbsp;</h3>



<p>The Qlik analytics engine allows users to explore data and easily create shareable reports. Qlik Sense is a powerful platform that, thanks to its AI and ML capabilities, lets business users exploit the full potential of their data.</p>



<p><strong>Read more:</strong></p>



<ul class="wp-block-list">
<li><a href="https://nearshore-it.eu/articles/xploring-the-power-of-qlik-nprinting/" data-type="link" data-id="https://nearshore-it.eu/articles/xploring-the-power-of-qlik-nprinting/">Exploring the power of Qlik NPrinting</a></li>



<li><a href="https://nearshore-it.eu/articles/qlik-talend-the-future-of-data/">Qlik + Talend – the future of data integration</a></li>
</ul>



<h2 class="wp-block-heading" id="The-future-of-AI-powered-BI-and-self-service-BI">The future of AI-powered BI and self-service BI</h2>



<p>The future of SSBI looks promising, especially in the context of integration with Artificial Intelligence (AI) technologies. AI can significantly improve the functionality of SSBI tools by:</p>



<ul class="wp-block-list">
<li><strong>Automation of analyses:</strong> AI can automatically analyze data and generate reports, further speeding up the decision-making process.</li>



<li><strong>Predictive analyses:</strong> AI can predict future trends and behaviors based on historical data, allowing for better planning and strategy.</li>



<li><strong>Personalization:</strong> AI can adapt SSBI tools to individual user needs, offering personalized ad hoc analysis and reports.</li>



<li><strong>Pattern recognition:</strong> AI can identify hidden patterns and correlations in data that are difficult for humans to detect.</li>
</ul>
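<p>The &#8220;predictive analyses&#8221; point above can be made concrete with a toy example. This is not how any specific AI-powered BI product works internally &#8211; just a minimal least-squares trend projection over invented monthly sales figures, the simplest form of forecasting from historical data.</p>

```python
# Minimal illustration of predictive analysis: fit a straight-line trend
# to historical monthly sales and project the next month. Real AI-powered
# BI tools use far richer models; the numbers here are invented.
def linear_forecast(history):
    """Least-squares line through (0..n-1, history); returns the next value."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) \
        / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * n  # projection for month n

sales = [100, 110, 120, 130]   # perfectly linear toy data
print(linear_forecast(sales))  # → 140.0
```

<p>The value of embedding this in a BI tool is that the projection is recomputed automatically on every data reload, without the user touching a model.</p>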



<p><strong>Also read:</strong></p>



<ul class="wp-block-list">
<li><a href="https://nearshore-it.eu/articles/what-is-data-quality/" data-type="link" data-id="https://nearshore-it.eu/articles/what-is-data-quality/">What is Data Quality? Data Management Best Practices &amp; More</a></li>



<li><a href="https://nearshore-it.eu/articles/data-mesh-vs-data-fabric-a-guide-to-better-data-management/">Data Mesh vs Data Fabric</a></li>



<li><a href="https://nearshore-it.eu/technologies/data-warehouse-architecture/">Data Warehouses management</a></li>
</ul>



<h2 class="wp-block-heading" id="BI-strategy-for-enhanced-analytics">BI strategy for enhanced analytics</h2>



<p>Choosing the SSBI tool alone is not enough. To become <a href="https://nearshore-it.eu/articles/data-driven-managment/">a data-driven organization</a>, you need a well-thought-out strategy. Which data will be analyzed? This is where the Master Data Management approach will help. Who will be responsible for the data project? Are you going to conduct it alone or in partnership with a Data Management services provider?&nbsp;</p>



<p>Implementation of a self-service BI strategy should include the following steps:&nbsp;</p>



<ul class="wp-block-list">
<li><strong>Selecting a Chief Data Officer (CDO)</strong> to support the initiative.</li>

<li><strong>Choosing a platform</strong> &#8211; if you are not sure which tool will meet your expectations, work with a partner company.</li>

<li><strong>Engaging users</strong> &#8211; representatives of different departments should support the implementation plan and be involved at every stage of the project.</li>

<li><strong>Creating a Business Intelligence team</strong> &#8211; a small BI team typically consists of an analyst, a developer, and a tester; larger project teams add a release manager, an administrator, and a team manager.</li>
</ul>



<h2 class="wp-block-heading" id="Summary">Summary</h2>



<p>Self-Service Business Intelligence is a powerful tool that allows end-users to independently collect, process, and analyze data. While SSBI involves certain risks, the benefits of its application are significant. The future of SSBI, combined with AI technologies, seems even more promising, opening new possibilities in data analysis and business decision-making.</p>


</style><div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2024/06/BigCTA_MarekCzachorowski.jpg" alt="BigCTA MarekCzachorowski" title="Self-service BI: What it is? Applications, Benefits, Risks, and Future with AI 4"></div><div class="tile-content"><p class="entry-title client-name promotion-box__headline2">Elevate Your Data Strategy</p>
<p class="promotion-box__description2">Our customized Data solutions align with your business objectives. Consult with <strong>Marek Czachorowski</strong>, Head of Data and AI Solutions, for expert guidance.</p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Schedule a meeting</a></div></div></div></div>
]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/articles/self-service-bi/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Exploring the power of Qlik NPrinting </title>
		<link>https://nearshore-it.eu/articles/xploring-the-power-of-qlik-nprinting/</link>
					<comments>https://nearshore-it.eu/articles/xploring-the-power-of-qlik-nprinting/#respond</comments>
		
		<dc:creator><![CDATA[Łukasz Jackiewicz]]></dc:creator>
		<pubDate>Thu, 18 Jul 2024 06:24:21 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Technologies]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<category><![CDATA[Data]]></category>
		<category><![CDATA[Qlik]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=28183</guid>

					<description><![CDATA[ If you are interested in streamlining analytics and reporting with solutions characterized by high scalability, read the article and explore the capabilities of NPrinting for Qlik Sense.  ]]></description>
										<content:encoded><![CDATA[
<p>The generation of readable on-demand reports is quite the challenge nowadays, as users from different domains have access to many data sources. One of the major advantages of the Qlik Sense Business Intelligence solution is Qlik NPrinting, which extends the standard capabilities of BI tools with automated reporting and notifications. If you are interested in streamlining analytics and reporting with solutions characterized by high scalability, read the article and explore the capabilities of NPrinting for Qlik Sense.&nbsp;&nbsp;</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#What-is-Qlik-Sense-Enterprise?">1.  What is Qlik Sense Enterprise?  </a></li>
                    <li><a href="#What-is-Qlik-NPrinting?">2.  What is Qlik NPrinting?  </a></li>
                    <li><a href="#Key-features-and-functionalities-of-Qlik-NPrinting">3.  Key features and functionalities of Qlik NPrinting  </a></li>
                    <li><a href="#Formats-for-Qlik-NPrinting-reports">4.  Formats for Qlik NPrinting reports </a></li>
                    <li><a href="#Benefits-of-Qlik-NPrinting">5.  Benefits of Qlik NPrinting  </a></li>
                    <li><a href="#Qlik-NPrinting-–-sample-Business-Intelligence-use-cases">6.  Qlik NPrinting – sample Business Intelligence use cases </a></li>
                    <li><a href="#Qlik-Sense-and-QlikView-NPrinting-alternatives">7.  Qlik Sense and QlikView NPrinting alternatives </a></li>
            </ol>
</div>


<h2 class="wp-block-heading">What is Qlik Sense Enterprise?</h2>



<p id="What-is-Qlik-Sense-Enterprise?">Qlik Sense is a modern data analytics and visualization platform that allows users to explore, analyze, and share data intuitively and interactively. The primary goal of Business Intelligence tools is to shift the focus from preparing reports (often requiring time-consuming exports from multiple systems, then combining and aggregating them, and finally preparing visualizations) to using this data for daily operations. This enables employees at all levels to make more accurate decisions based on current data rather than intuition.</p>



<p><strong>Also read: </strong></p>



<ul class="wp-block-list">
<li><a href="https://nearshore-it.eu/articles/data-driven-decision-making/">Use data to make informed decisions. Understanding the Importance of Data-Driven Decision Making</a></li>



<li><a href="https://nearshore-it.eu/articles/self-service-bi/">Self-service BI: What it is? Applications, Benefits, Risks, and Future with AI</a></li>



<li><a href="https://nearshore-it.eu/articles/5-popular-business-intelligence-tools-2/">5 of the most popular Business Intelligence tools</a></li>
</ul>



<h2 class="wp-block-heading" id="What-is-Qlik-NPrinting?">What is Qlik NPrinting?</h2>



<p>In a standard Business Intelligence approach, users must log into the platform and open a specific dashboard to analyze the data contained therein.</p>



<p>Qlik NPrinting, however, enables automated reporting that goes beyond this – users do not need to check the dashboard themselves (though this is still an option for a broader context).</p>



<h3 class="wp-block-heading">Understanding the basics of Qlik Sense NPrinting</h3>



<p>Instead, users can receive a predefined report after each data reload or at a fixed time. This represents a further step in simplifying reporting and report distribution – from creating reports &#8220;manually&#8221; without BI tools, to a dashboard where data reloads cyclically but must be opened to view, to Qlik NPrinting, where the report is automatically distributed to a defined group of recipients, e.g., via email or the NewsStand portal. Qlik’s NewsStand platform serves to store up-to-date reports. This portal grants users the ability to preview and download reports and request automated updates.</p>



<h2 class="wp-block-heading" id="Key-features-and-functionalities-of-Qlik-NPrinting">Key features and functionalities of Qlik NPrinting</h2>



<p>Qlik NPrinting is an add-on to Qlik Sense that allows the creation of a report template, which is then distributed to a defined group of recipients. This requires the following steps:</p>



<ol class="wp-block-list">
<li>connecting to the application that will be the data source for the report,</li>



<li>preparing filters (if we want to send a report only from a specific region/data subset),</li>



<li>conditions (if we want the report to be sent only when a specific condition is met),</li>



<li>the report itself.</li>
</ol>
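<p>Qlik NPrinting is configured through its web console rather than code, but the filter-and-condition steps above can be modeled conceptually. The sketch below is purely illustrative &#8211; the region filter and sales threshold are hypothetical, and this is not the NPrinting API.</p>

```python
# Conceptual sketch only: model NPrinting's filter (step 2) and send
# condition (step 3) for one recipient group. All names are hypothetical.
def should_send(rows, region, min_total):
    """Apply a region filter, then evaluate a send condition on the subset."""
    subset = [r for r in rows if r["region"] == region]   # step 2: filter
    total = sum(r["sales"] for r in subset)
    return total >= min_total, subset                     # step 3: condition

rows = [{"region": "EU", "sales": 40}, {"region": "US", "sales": 90},
        {"region": "EU", "sales": 70}]
send, subset = should_send(rows, region="EU", min_total=100)
print(send, len(subset))  # → True 2
```

<p>In NPrinting itself, the equivalent configuration is done per task: the filter restricts the data subset, and the condition decides whether the generated report is distributed at all.</p>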


</style><div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2024/06/BigCTA_MarekCzachorowski.jpg" alt="BigCTA MarekCzachorowski" title="Exploring the power of Qlik NPrinting  6"></div><div class="tile-content"><p class="entry-title client-name promotion-box__headline2">Elevate Your Data Strategy</p>
<p class="promotion-box__description2">Our customized Data solutions align with your business objectives. Consult with <strong>Marek Czachorowski</strong>, Head of Data and AI Solutions, for expert guidance.</p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Schedule a meeting</a></div></div></div></div>



<h2 class="wp-block-heading" id="Formats-for-Qlik-NPrinting-reports">Formats for Qlik NPrinting reports</h2>



<p>Qlik NPrinting enables report generation in several formats:</p>



<ul class="wp-block-list">
<li>Excel</li>



<li>HTML</li>



<li>PowerPoint</li>



<li>Qlik Entity</li>



<li>Word</li>



<li>Pixel Perfect</li>
</ul>



<p>After loading the data and setting the scheduler, the created template is then attached to an email or placed on the NewsStand platform.</p>



<h2 class="wp-block-heading" id="Benefits-of-Qlik-NPrinting">Benefits of Qlik NPrinting</h2>



<h3 class="wp-block-heading">Multiple formats</h3>



<p>Qlik NPrinting offers a wide range of report formats, allowing for easy integration with other business tools and tailoring reports to the specific needs of recipients. For example, financial reports can be generated in Excel for further analysis and data manipulation, while presentation reports can be created in PDF format for readability and a professional appearance.</p>



<h3 class="wp-block-heading">Providing the right reports to the right people</h3>



<p>Qlik NPrinting offers extensive report customization capabilities. Users can tailor reports to specific needs using templates and scripts. Personalization allows for the creation of reports that precisely meet the recipients&#8217; requirements, containing only the information relevant to them. For instance, managers can receive reports summarizing key performance indicators (KPIs), while analysts can receive more detailed reports with raw data for further analysis.</p>



<h3 class="wp-block-heading">Great-looking reports with NPrinting reporting capabilities</h3>



<p>Qlik NPrinting enables users to create visually attractive reports based on Qlik data. Such appealing visualizations can be created and distributed effectively in an automated manner.</p>



<p>Not only can a user define an email message using the built-in editor, which allows the use of HTML tags, and add variables directly from Qlik (such as email addresses or a list of recipients that is dynamically updated with each data model refresh), but also, for example, attach an entire chart from the dashboard (as in the Supplier Performance Reporting paragraph). Additionally, they can format the attachment suitably – e.g., an .xlsx file – by appropriately coloring the columns:</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="141" src="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic1.png" alt="NPrinting" class="wp-image-28189" title="Exploring the power of Qlik NPrinting  7" srcset="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic1.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic1-300x56.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic1-495x92.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<p>Or by attaching a sheet with data to the output and an additional pivot table that will contain a summary of the data:</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="477" height="295" src="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic2.png" alt="NPrinting" class="wp-image-28191" title="Exploring the power of Qlik NPrinting  8" srcset="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic2.png 477w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic2-300x186.png 300w" sizes="auto, (max-width: 477px) 100vw, 477px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<p>Of course, there is nothing to prevent formatting the data to meet the company&#8217;s internal standards regarding report layout (adding an image with the company logo, appropriate font and size, colors, etc.).</p>



<h2 class="wp-block-heading" id="Qlik-NPrinting-–-sample-Business-Intelligence-use-cases">Qlik NPrinting – sample Business Intelligence use cases</h2>



<p>Now that we know the basic capabilities in terms of reporting and distribution, it&#8217;s time to move on to specific applications made possible by the NPrinting engine.</p>



<h3 class="wp-block-heading">Example 1: Automating sales reporting</h3>



<p>Let’s imagine that every week we need to create a sales breakdown for individual salespeople and send such a report to each salesperson. Nothing could be simpler. Using Qlik NPrinting, we connect to the sales dashboard and configure the report with an attachment in the form of an Excel/PDF/HTML file containing the data snapshot. Additionally, we can fully customize the message, attach any graphics (or a chart directly from the dashboard), and also pull the current list of salespeople directly from Qlik using a variable.</p>



<p>The report can also be customized &#8211; salespeople from a given region can receive only the data relevant to them. Qlik NPrinting will then generate as many messages and attachments as there are defined regions, and schedule and send the reports accordingly. It is as simple as that. This way, we eliminate the manual preparation of weekly summaries and their distribution to a specific group of recipients.</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="606" src="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic3.png" alt="NPrinting" class="wp-image-28193" title="Exploring the power of Qlik NPrinting  9" srcset="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic3.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic3-300x240.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic3-493x395.png 493w" sizes="auto, (max-width: 756px) 100vw, 756px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h3 class="wp-block-heading">Example 2: Automatic notification system</h3>



<p>Another interesting use case for Qlik NPrinting is an automatic notification system. Let’s now imagine that we have a dashboard collecting data on the current state of production, refreshed every hour. If the employee responsible for production metrics had to check it every hour, it would be inefficient and bear the risk of error. This is another excellent example of the application of Qlik NPrinting, saving time by creating a notification that will be generated only if a predefined metric drops below a certain level. In our example, the user responsible for monitoring the metrics will get an automatic notification as soon as one of the metrics falls below the set threshold.</p>



<p>Such a solution optimizes the employee&#8217;s time and, most importantly, shortens the time between the event, reading data from the dashboard, and taking action. It also reduces the risk of missing an event, as we are notified each time it occurs (and the recipient group can be broader than one person).</p>
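<p>The notification pattern described above boils down to a threshold check after each data reload. Here is a hypothetical sketch of that logic &#8211; the metric names and thresholds are invented for illustration, and in NPrinting the check itself is configured as a send condition rather than written in code.</p>

```python
# Hypothetical sketch: after each hourly refresh, compare monitored
# production metrics to thresholds and emit alerts only for breaches.
# Metric names and threshold values are invented for illustration.
THRESHOLDS = {"output_per_hour": 500, "yield_pct": 95.0}

def alerts(metrics):
    """Return alert messages for every metric below its threshold."""
    return [
        f"ALERT: {name} = {value} (threshold {THRESHOLDS[name]})"
        for name, value in metrics.items()
        if name in THRESHOLDS and value < THRESHOLDS[name]
    ]

print(alerts({"output_per_hour": 480, "yield_pct": 97.2}))
```

<p>Because no message is produced when every metric is within bounds, recipients only hear about the dashboard when it actually requires their attention.</p>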



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="257" src="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic4.png" alt="NPrinting" class="wp-image-28195" title="Exploring the power of Qlik NPrinting  10" srcset="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic4.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic4-300x102.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic4-495x168.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h3 class="wp-block-heading">Example 3: Supplier performance reporting</h3>



<p>Qlik NPrinting can also be used to distribute content outside the organization. Let’s say that as a manufacturing company, we evaluate our suppliers quarterly in an automated manner (based on data on delivery timeliness, quality, packaging errors, etc.). Without Qlik NPrinting, it would be necessary to manually export this data from the dashboard or have an external IT system that would pull it from the ERP system, calculate it, and send it to recipients. However, using the power of this Qlik Sense add-on, it is possible to quickly create a report, define filters, and automatically inform suppliers of their results each quarter.</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="512" src="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic5.png" alt="NPrinting" class="wp-image-28197" title="Exploring the power of Qlik NPrinting  11" srcset="https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic5.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic5-300x203.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/07/nearshore_2024.07.16_graphic5-495x335.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption"><em>With NPrinting, we get great-looking reports in an automated manner&nbsp;</em>&nbsp;</figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h2 class="wp-block-heading" id="Qlik-Sense-and-QlikView-NPrinting-alternatives">Qlik Sense and QlikView NPrinting alternatives</h2>



<p>Qlik NPrinting is not the only tool that extends the functionality of Qlik. One of the major advantages of the Qlik Sense environment is that it lets the community create extensions and add-ons, making it possible to find alternatives for every solution &#8211; and this is no exception.</p>



<h3 class="wp-block-heading">Mail &amp; Deploy</h3>



<p>One of the alternative solutions that can compete with Qlik NPrinting is Mail &amp; Deploy, which was created by members of the Qlik community and later commercialized. This add-on has similar options to the product endorsed by Qlik, allowing for the creation of reports in various formats, sending notifications, and managing them in an administrative panel.</p>



<h3 class="wp-block-heading">Qlik Cloud</h3>



<p>When it comes to Qlik Cloud, the equivalent of Qlik NPrinting is Qlik Reporting. This tool is the successor to NPrinting. In the case of migration from an on-premises version to the cloud, it is possible to import reports from NPrinting, adapted to function in a cloud environment.</p>



<p><strong>Also read:</strong> <a href="https://nearshore-it.eu/articles/qlik-talend-the-future-of-data/">Qlik + Talend &#8211; the future of data integration</a></p>



<h2 class="wp-block-heading">Summary</h2>



<p>Qlik NPrinting is a powerful tool that brings significant benefits to companies through advanced report generation capabilities, process automation, and flexible distribution options. Thanks to integration with the QlikView and Qlik Sense platforms, Qlik NPrinting enables the creation of accurate, personalized reports that support decision-making processes and increase operational efficiency.</p>


</style><div class="promotion-box promotion-box--image-left promotion-box--full-width-without-image"><div class="tiles latest-news-once"><div class="tile"><div class="tile-content"><p class="promotion-box__description2"><strong>Consult your project directly with a specialist</strong></p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Book a meeting</a></div></div></div></div>
]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/articles/xploring-the-power-of-qlik-nprinting/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Data Mesh vs Data Fabric: Their Purpose, Similarities, Differences &#038; How They Work Together </title>
		<link>https://nearshore-it.eu/articles/data-mesh-vs-data-fabric-a-guide-to-better-data-management/</link>
					<comments>https://nearshore-it.eu/articles/data-mesh-vs-data-fabric-a-guide-to-better-data-management/#respond</comments>
		
		<dc:creator><![CDATA[-- Nie pokazuj autora --]]></dc:creator>
		<pubDate>Fri, 05 Jul 2024 13:25:13 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Technologies]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=28094</guid>

					<description><![CDATA[In today's digital world, data is everywhere. Companies of all shapes and sizes are swimming in data, but simply having various data products isn't enough to create a competitive advantage. Instead, you need to enable teams to use and share data in a way that's consistent, powerful, and insightful. ]]></description>
										<content:encoded><![CDATA[
<p>Say hello to Data Mesh and Data Fabric. These are not just fancy tech buzzwords, but game-changing approaches to data architecture, transforming how organisations handle, understand, and use data. But what exactly are they? How do they work? And most importantly, how can they help businesses drive better results?&nbsp;</p>



<p>In this article, we&#8217;ll explore what Data Mesh and Data Fabric technologies are, their similarities and differences, and how they can work together to turbocharge your entire data management approach.&nbsp;</p>



<p>Let&#8217;s get started!&nbsp;</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#What-is-a-Data-Mesh?-Decentralized-Data-&#038;-More">1.  What is a Data Mesh? Decentralized Data &#038; More </a></li>
                    <li><a href="#The-Advantages-&#038;-Disadvantages-of-a-Data-Mesh">2.  The Advantages &#038; Disadvantages of a Data Mesh </a></li>
                    <li><a href="#What-is-a-Data-Fabric?-Data-Quality-&#038;-Consistency">3.  What is a Data Fabric? Data Quality &#038; Consistency </a></li>
                    <li><a href="#The-Advantages-&#038;-Disadvantages-of-a-Data-Fabric">4.  The Advantages &#038; Disadvantages of a Data Fabric </a></li>
                    <li><a href="#How-Do-Data-Mesh-and-Data-Fabric-Technologies-Work-together-to-Improve-Data-Management?">5.  How Do Data Mesh and Data Fabric Technologies Work together to Improve Data Management? </a></li>
                    <li><a href="#Where-Does-Data-Governance,-Data-Lakes-and-Data-Warehouses-Fit-In?">6.  Where Does Data Governance, Data Lakes and Data Warehouses Fit In? </a></li>
                    <li><a href="#The-Challenges-and-Considerations-of-Implementing-a-Data-Fabric-and-Data-Mesh">7.  The Challenges and Considerations of Implementing a Data Fabric and Data Mesh </a></li>
                    <li><a href="#Realise-The-Benefits-of-Data-Mesh-&#038;-Data-Fabric-by-Working-With-an-Expert-Partner">8.  Realise The Benefits of Data Mesh &#038; Data Fabric by Working With an Expert Partner </a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="What-is-a-Data-Mesh?-Decentralized-Data-&amp;-More">What is a Data Mesh? Decentralized Data &amp; More&nbsp;</h2>



<p>A<a href="https://nearshore-it.eu/articles/unveiling-data-mesh-architecture-principles/" target="_blank" rel="noreferrer noopener"> Data Mesh</a> is a way of organizing and using data in big companies. Instead of having one central team and system that handles all the data, a Data Mesh splits up the responsibility among the teams or departments who are closest to the data source. For example, this means HR teams owning HR data, and Sales teams managing Sales data.&nbsp;&nbsp;</p>



<p>Think of it like a family dinner. Rather than one person cooking the entire meal, each family member cooks the part they&#8217;re best at. This way, everyone gets to share the meal, with each part being extra delicious as it was created by the expert.&nbsp;</p>



<p>Here&#8217;s a breakdown of some key principles in the data mesh approach, and how they work in practice:&nbsp;</p>



<ol class="wp-block-list" start="1">
<li><strong>Data as a Product:</strong> Each team treats its data like a product that they offer to the rest of the company. They ensure data is high-quality, easy to use, and valuable for others.&nbsp;</li>



<li><strong>Domain Ownership:</strong> The teams that create or use the data the most are responsible for managing it. After all, they know their data best.&nbsp;</li>



<li><strong>Self-Serve Data Platform:</strong> Each domain manages their data in a system that enables sharing, making it easy for other teams to access and use the data.&nbsp;</li>
</ol>
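<p>To make the three principles above a little more concrete, here is a minimal sketch in Python. It is purely illustrative, not a real framework: the class and dataset names are invented, and a real self-serve platform would be a full product, not a dictionary. The point is only to show the shape of the idea: a domain publishes its data as a product with an owner and a quality guarantee, and a shared registry lets any other team discover and read it.</p>

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DataProduct:
    name: str
    owner_domain: str                       # Domain Ownership: the team closest to the data
    rows: list[dict]
    quality_check: Callable[[dict], bool]   # Data as a Product: the owner guarantees quality

    def read(self) -> list[dict]:
        # Consumers only ever see rows that pass the owner's quality check.
        return [r for r in self.rows if self.quality_check(r)]

class SelfServePlatform:
    """Shared registry so any team can discover and read other domains' data."""
    def __init__(self):
        self._products: dict[str, DataProduct] = {}

    def publish(self, product: DataProduct) -> None:
        self._products[product.name] = product

    def read(self, name: str) -> list[dict]:
        return self._products[name].read()

platform = SelfServePlatform()
platform.publish(DataProduct(
    name="sales.orders",
    owner_domain="Sales",
    rows=[{"id": 1, "amount": 120.0}, {"id": 2, "amount": -5.0}],  # second row is bad data
    quality_check=lambda r: r["amount"] >= 0,
))

# Any other domain (e.g. Finance) can now self-serve clean Sales data:
orders = platform.read("sales.orders")
```

<p>Notice that the Sales team, not a central data team, decided what &#8220;good&#8221; data means for its own domain.</p>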



<p>In our family dinner analogy, as well as creating the dish, each family member also shares the recipe and answers questions about it. This way, everyone gets to enjoy the meal and benefit from new insights and information.&nbsp;&nbsp;</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>Expert insight</em>&nbsp;<br><em>Piotr Rembowski, Principal Data Engineer at Inetum</em>&nbsp;</p>



<cite><em>&#8220;In the world of Big Data, it is necessary to choose the right architecture to allow for effective data management, scaling, and resource optimization. In recent years, Data Mesh architecture has become one of the leading concepts for Big Data&#8221;.&nbsp;</em>&nbsp;</cite></blockquote>



<h2 class="wp-block-heading" id="The-Advantages-&amp;-Disadvantages-of-a-Data-Mesh">The Advantages &amp; Disadvantages of a Data Mesh&nbsp;</h2>



<p>Creating a data mesh architecture takes time and effort, with many organizations needing to move from a centralized data model, to a decentralized alternative.&nbsp;&nbsp;</p>



<p>Let&#8217;s have a look at some of the advantages and disadvantages of a data mesh.&nbsp;&nbsp;</p>



<p><strong>Advantages:</strong>&nbsp;</p>



<ol class="wp-block-list" start="1">
<li><strong>Speed and Flexibility:</strong> Because different business teams can work on their data domain independently, they can move quickly and adapt to changes without waiting for a central data team.&nbsp;</li>



<li><strong>Better Quality Data:</strong> The teams that know the data the best can ensure it is accurate and useful, leading to additional benefits of data quality.&nbsp;</li>



<li><strong>Increased Innovation:</strong> When teams have control over their data, they&#8217;re more likely to come up with creative ways to use it.&nbsp;</li>



<li><strong>Scalability:</strong> As the company grows, the Data Mesh can easily grow with it by adding new teams and domains.&nbsp;</li>



<li><strong>Improved Data Literacy: </strong>When more people in the company work directly with data, it breaks down data silos and improves the overall understanding and use of data.&nbsp;</li>
</ol>



<p><strong>Disadvantages:</strong>&nbsp;</p>



<ol class="wp-block-list" start="1">
<li><strong>Complexity:</strong> Setting up a Data Mesh isn&#8217;t easy. It requires a big shift in how people think about and work with data, as well as additional effort from experts such as data scientists and enterprise data architects.&nbsp;</li>



<li><strong>Potential for Inconsistency: </strong>With different teams handling data differently, there&#8217;s a risk of inconsistencies across data storage, data security, and data virtualization processes.&nbsp;</li>



<li><strong>Need for New Skills: </strong>Teams need to learn new skills to manage their data effectively, which can be challenging and time-consuming.&nbsp;</li>



<li><strong>Initial Slowdown:</strong> Implementing a Data Mesh can slow things down at first as everyone adjusts to the new approach to data management.&nbsp;</li>



<li><strong>Governance Challenges:</strong> Balancing centralized rules with team autonomy can be tricky and may lead to conflicts.&nbsp;</li>
</ol>



<h2 class="wp-block-heading" id="What-is-a-Data-Fabric?-Data-Quality-&amp;-Consistency">What is a Data Fabric? Data Quality &amp; Consistency&nbsp;</h2>



<p>A Data Fabric is an architecture that weaves together different data sources, types, and locations into a unified capability. It&#8217;s like an intelligent, invisible layer that sits on top of all your data, making it easier to access, manage, and use, regardless of where it&#8217;s stored or what format it&#8217;s in.&nbsp;</p>



<p>Here&#8217;s how a data fabric architecture brings everything together:&nbsp;</p>



<ol class="wp-block-list" start="1">
<li><strong>Data integration:</strong> It connects all the data sources together, whether they&#8217;re in the cloud, on your computer, or somewhere in between.&nbsp;</li>



<li><strong>Data Automation:</strong> It picks up the work of moving and preparing data, using smart algorithms to quickly manage the complexity of data technologies.&nbsp;</li>



<li><strong>Data Discovery:</strong> It helps you find the data you need, even if you&#8217;re not sure where it&#8217;s stored &#8211; essentially creating a huge data catalog.&nbsp;</li>



<li><strong>Data Governance:</strong> It sets the governance standards, and keeps track of who&#8217;s using what data and makes sure everyone follows the rules.</li>



<li><strong>AI and Machine Learning:</strong> These technologies help the Data Fabric learn and improve over time, making data management smarter and more efficient.&nbsp;</li>
</ol>
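<p>A toy sketch can show what that &#8220;intelligent, invisible layer&#8221; looks like from a consumer&#8217;s point of view. Everything here is hypothetical (the class, dataset names, and formats are made up for illustration): the fabric keeps a small catalog of where each dataset lives and in what format, and consumers ask for data by name rather than by location.</p>

```python
import csv
import io
import json

class DataFabric:
    def __init__(self):
        self._sources = {}   # dataset name -> (format, raw payload): Data Discovery

    def connect(self, name, fmt, payload):
        self._sources[name] = (fmt, payload)   # Data Integration: register any source

    def read(self, name):
        fmt, payload = self._sources[name]
        # Data Automation: the fabric, not the consumer, absorbs format differences.
        if fmt == "json":
            return json.loads(payload)
        if fmt == "csv":
            return list(csv.DictReader(io.StringIO(payload)))
        raise ValueError(f"unknown format: {fmt}")

fabric = DataFabric()
fabric.connect("hr.employees", "json", '[{"id": "1", "name": "Ada"}]')
fabric.connect("sales.orders", "csv", "id,amount\n1,120\n2,80\n")

# Consumers get uniform records regardless of where or how the data is stored:
employees = fabric.read("hr.employees")
orders = fabric.read("sales.orders")
```

<p>The consumer never sees whether the data was JSON in the cloud or CSV on a file share; that is the essence of the unified access layer.</p>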



<p>You can think of a data fabric as a super-librarian who not only knows where every book in the library is but can also instantly translate them, combine information from different books, and even suggest books you might find useful – all in the blink of an eye.&nbsp;</p>



<p>Also read:&nbsp;</p>



<ul class="wp-block-list">
<li><a href="https://nearshore-it.eu/articles/technologies/getting-started-with-homelab/" target="_blank" rel="noreferrer noopener">Getting started with Homelab&nbsp;</a>&nbsp;</li>



<li><a href="https://nearshore-it.eu/articles/what-is-data-quality/" target="_blank" rel="noreferrer noopener">Data Quality&nbsp;</a>&nbsp;</li>
</ul>



<h2 class="wp-block-heading" id="The-Advantages-&amp;-Disadvantages-of-a-Data-Fabric">The Advantages &amp; Disadvantages of a Data Fabric&nbsp;</h2>



<p>Like a Data Mesh, creating a Data Fabric is no small task. Some of the advantages and disadvantages of implementing a data fabric include:&nbsp;</p>



<p><strong>Advantages:</strong>&nbsp;</p>



<ol class="wp-block-list">
<li>Unified Data Access: Regardless of where your data pipelines or data warehouses live, a Data Fabric makes it easy to find and use data.&nbsp;</li>



<li>Improved Data Quality: With built-in data governance and quality checks, you can trust the data you&#8217;re using.&nbsp;</li>



<li>Faster Insights: Automation and smart data management bring together various data sources to enable users to get answers faster.&nbsp;</li>



<li>Flexibility: A Data Fabric approach creates flexibility, as it can adapt to new data sources and technologies, future-proofing your data infrastructure.&nbsp;</li>



<li>Reduced Complexity: While the technology itself is complex, it simplifies data architecture across the board, making life easier for data consumers.&nbsp;</li>
</ol>



<p><strong>Disadvantages:&nbsp;</strong></p>



<ol class="wp-block-list">
<li>High Initial Cost: Implementing a Data Fabric can be expensive, requiring significant investment in technology and inputs from data engineering experts.&nbsp;</li>



<li>Potential for Over-Reliance: There&#8217;s a risk businesses can become too dependent on the Data Fabric, reducing data skills and competencies within the organisation.&nbsp;</li>



<li>Security Concerns: With data from multiple sources flowing more freely, ensuring security and privacy can be more challenging.&nbsp;</li>



<li>Change Management: Adopting a Data Fabric often requires significant changes in how people work with data, which can be met with resistance.</li>
</ol>



<h2 class="wp-block-heading" id="How-Do-Data-Mesh-and-Data-Fabric-Technologies-Work-together-to-Improve-Data-Management?">How Do Data Mesh and Data Fabric Technologies Work together to Improve Data Management?&nbsp;</h2>



<p>Now, you might be thinking, “If Data Mesh and Data Fabric are both about managing data, do I have to choose between them?”&nbsp;&nbsp;</p>



<p>The good news is, you don&#8217;t! As we&#8217;ve seen from the descriptions, a Data Mesh focuses on the structure and location of data, while a Data Fabric enables organizations to bring data together quickly and consistently.&nbsp;</p>



<p>Here&#8217;s how the two complement each other to drive data management transformations:&nbsp;</p>



<ol class="wp-block-list" start="1">
<li><strong>Structure: </strong>Data Mesh provides the structure of the data, with the Data Fabric determining the standards and rules each domain should apply.&nbsp;</li>



<li><strong>Data Discovery: </strong>Data Mesh aligns data with its rightful owner, while a Data Fabric brings data from across the business together.&nbsp;</li>



<li><strong>Centralization vs Decentralization:</strong> Data Mesh promotes a decentralized approach to data ownership, while Data Fabric provides a centralized approach to data ingestion to access and manage this decentralized data.&nbsp;</li>



<li><strong>Data Quality:</strong> Data Mesh puts the responsibility for data quality in the hands of domain experts, while Data Fabric provides the tools to enforce and monitor data quality across the organization.&nbsp;</li>



<li><strong>Self-Service: </strong>Data Mesh promotes a self-service culture, and Data Fabric solutions provide the technical capabilities to make self-service a reality.&nbsp;</li>



<li><strong>Scalability:</strong> As the organization grows and data becomes more complex, Data Mesh provides a scalable organizational model, while Data Fabric offers a scalable technical solution.&nbsp;</li>
</ol>
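<p>The centralized/decentralized split described above can be sketched in a few lines of Python. This is an illustrative toy, with made-up dataset and field names: domains own their records (decentralized, the mesh side), while one shared quality rule, defined centrally, is monitored across all of them (the fabric side).</p>

```python
# Each domain owns and maintains its own records (the Data Mesh side).
domain_data = {
    "hr.employees": [{"id": 1, "email": "ada@example.com"}, {"id": 2, "email": ""}],
    "sales.orders": [{"id": 10, "email": "buyer@example.com"}],
}

def shared_rule(record):
    # One rule, defined centrally, enforced everywhere: email must be non-empty.
    return bool(record.get("email"))

def quality_report(datasets):
    # The fabric-style layer monitors every decentralized dataset
    # against the shared rule and counts the violations per domain.
    return {name: sum(1 for r in rows if not shared_rule(r))
            for name, rows in datasets.items()}

report = quality_report(domain_data)
print(report)  # {'hr.employees': 1, 'sales.orders': 0}
```

<p>The HR team still owns and fixes its own bad record; the central layer merely makes the problem visible everywhere, which is exactly the division of labour the two approaches promise.</p>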



<h2 class="wp-block-heading" id="Where-Does-Data-Governance,-Data-Lakes-and-Data-Warehouses-Fit-In?">Where Does Data Governance, Data Lakes and Data Warehouses Fit In?&nbsp;</h2>



<p>While this article has focused on the difference between data meshes and data fabrics, other terms such as data lakes and data warehouses come up a lot in conversation &#8211; so where do they fit in?&nbsp;&nbsp;</p>



<p>Well, to answer the question, let&#8217;s have a look at how they supplement your wider data management processes.&nbsp;&nbsp;</p>



<ul class="wp-block-list">
<li><strong>Data Lakes and Data Warehouses.</strong> Both of these are used to store relational and non-relational data from various sources. While historically used as disparate data sources, under a Data Fabric model, the use and control of data lakes and warehouses is more tightly aligned. &nbsp;</li>



<li><strong>Data Governance. </strong>This refers to your overarching approach for managing data, including the assurances in place to check quality and the management structures to make decisions. Both data meshes and data fabrics form part of your broader governance, shaping the way you manage data within your business.&nbsp;&nbsp;</li>
</ul>



<h2 class="wp-block-heading" id="The-Challenges-and-Considerations-of-Implementing-a-Data-Fabric-and-Data-Mesh">The Challenges and Considerations of Implementing a Data Fabric and Data Mesh&nbsp;</h2>



<p>While Data Mesh and Data Fabric provide a unified approach and exciting possibilities, implementing them isn&#8217;t as simple as flipping a switch. Here are some key challenges and considerations to keep in mind:&nbsp;</p>



<ol class="wp-block-list" start="1">
<li><strong>Cultural Shift:</strong> Both approaches require a significant change in how people think about and work with data. This can be met with resistance and requires careful change management.&nbsp;</li>



<li><strong>Skills Gap:</strong> Data Mesh is an emerging approach, so implementing and maintaining these systems requires new skills. Organizations need to invest in training or hiring people with the right expertise.&nbsp;</li>



<li><strong>Technology Investment:</strong> Data fabric is technology-centric and, as such, needs a significant investment in sophisticated tools and technologies to make it work.&nbsp;</li>



<li><strong>Scalability: </strong>While both approaches are designed to be scalable, actually scaling them in practice can be challenging, especially for large, complex organizations.&nbsp;</li>



<li><strong>Integration with Existing Systems:</strong> Data Mesh focuses on organizational system restructuring, but most organizations already have existing data systems in place. Integrating Data Mesh or Data Fabric with these existing systems can be complex.&nbsp;</li>



<li><strong>Measuring Success:</strong> It can be difficult to quantify the benefits of these approaches, especially in the short term. Organizations need to think carefully about how they&#8217;ll measure success and demonstrate value for money. &nbsp;</li>



<li><strong>Choosing the Right Approach:</strong> Data Mesh and Data Fabric aren&#8217;t one-size-fits-all solutions. Organizations need to carefully consider their specific needs and challenges to decide which approach (or combination of approaches) is right for them.&nbsp;</li>
</ol>



<h2 class="wp-block-heading" id="Realise-The-Benefits-of-Data-Mesh-&amp;-Data-Fabric-by-Working-With-an-Expert-Partner">Realise The Benefits of Data Mesh &amp; Data Fabric by Working With an Expert Partner&nbsp;</h2>



<p>The challenge of selecting the right approach can be a daunting one, and that&#8217;s why we&#8217;d always recommend working with a trusted IT partner. At Inetum, we&#8217;ve helped hundreds of clients transform the way they store, share, and use data, helping them reach new heights and win more business.&nbsp;</p>



<p>Our Data Consulting approach starts by helping you define a vision and strategy that&#8217;s right for your digital culture, assets, and internal skills. From there, we guide you through implementing sound data governance built around a data architecture that aligns with your local IT constraints, finances, and culture to transform the way you work.&nbsp;&nbsp;</p>



<figure class="wp-block-table"><table><tbody><tr><td><div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2024/06/BigCTA_MarekCzachorowski.jpg" alt="BigCTA MarekCzachorowski" title="Data Mesh vs Data Fabric: Their Purpose, Similarities, Differences &amp; How They Work Together  12"></div><div class="tile-content"><p class="entry-title client-name promotion-box__headline2">Sounds good, right?</p>
<p class="promotion-box__description2">You can see more about how we do it by scheduling a one-to-one consultancy or advisory call with our expert</p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Book a call</a></div></div></div></div></td></tr></tbody></table></figure>
]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/articles/data-mesh-vs-data-fabric-a-guide-to-better-data-management/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Data Mesh: Unveiling the Future of Data Management </title>
		<link>https://nearshore-it.eu/articles/unveiling-data-mesh-architecture-principles/</link>
					<comments>https://nearshore-it.eu/articles/unveiling-data-mesh-architecture-principles/#respond</comments>
		
		<dc:creator><![CDATA[Piotr Rembowski]]></dc:creator>
		<pubDate>Tue, 30 Apr 2024 09:20:49 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Technologies]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=27324</guid>

					<description><![CDATA[What is Data Mesh? What is behind this concept? What are the assumptions behind it and how is it implemented? What are the advantages and disadvantages of this approach? Is it just another trend, or will it bring real benefits to your company?]]></description>
										<content:encoded><![CDATA[
<p>In the world of Big Data, it is necessary to choose the right architecture to allow for effective data management, scaling, and resource optimization. In recent years, Data Mesh architecture has become one of the leading concepts for Big Data. I hope that this article will provide answers to the above questions for all organizations looking for optimal data management solutions.</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#Evolution-of-data-architecture">1.  Evolution of data architecture</a></li>
                    <li><a href="#Data-Lake">2.  Data Lake</a></li>
                    <li><a href="#Data-Lake-Challenges">3.  Data Lakehouse </a></li>
                    <li><a href="#Why-Was-the-Data-Mesh-Concept-Created?">4.  Why Was the Data Mesh Concept Created?</a></li>
                    <li><a href="#Data-Mesh-Principles-Explained">5.  Data Mesh Principles Explained</a></li>
                    <li><a href="#Data-Mesh-benefits-">6.  Data Mesh benefits </a></li>
                    <li><a href="#Data-Mesh-challenges-">7.  Data Mesh challenges</a></li>
                    <li><a href="#Data-Mesh-–-is-this-for-me?">8.  Data Mesh – is this for me? </a></li>
                    <li><a href="#Implementing-Data-Mesh-strategy-–-how-to-get-started?">9.  Implementing Data Mesh strategy – how to get started? </a></li>
                    <li><a href="#Data-Mesh-Implementation-–-summary">10.  Data Mesh Implementation – summary</a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="Evolution-of-data-architecture">Evolution of data architecture&nbsp;</h2>



<p>A lot has changed in the world of digital solutions in the context of data analysis. However, from a historical point of view, one thing is still the same. In order not to disrupt the operation of systems or applications during the analysis, the operational data is copied. What is constantly evolving is the target form of analytical data. &nbsp;</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="500" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_1.png" alt="data mesh" class="wp-image-27336" title="Data Mesh: Unveiling the Future of Data Management  13" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_1.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_1-300x198.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_1-495x327.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<p>Initially (at the end of the last century), the destination was a<strong> data warehouse</strong> powered by ETL processes. The data was extracted (<strong>Extract</strong>) from the source systems, transformed into the target form (<strong>Transform</strong>), and loaded (<strong>Load</strong>) into the target destination – most often to fact tables and dimension tables forming a structure resembling a snowflake, optimized for quick reading and conducting specific analyses.</p>
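<p>A minimal sketch may help here. The code below is illustrative only (the record and column names are made up), but it shows the three ETL steps in order, ending in a small fact table keyed against a customer dimension table, just as the classic warehouse design prescribes.</p>

```python
source_rows = [
    {"order_id": 1, "customer": "ACME", "amount": "120.50"},
    {"order_id": 2, "customer": "Globex", "amount": "80.00"},
]

def extract(rows):
    return list(rows)                          # Extract: copy data out of the source system

def transform(rows):
    # Build a dimension table: each distinct customer gets a surrogate key.
    dim_customer = {name: key for key, name in
                    enumerate(sorted({r["customer"] for r in rows}), start=1)}
    # Build the fact table: typed amounts, joined to the dimension by key.
    fact_orders = [{"order_id": r["order_id"],
                    "customer_key": dim_customer[r["customer"]],
                    "amount": float(r["amount"])}
                   for r in rows]
    return dim_customer, fact_orders           # Transform: shaped for analysis

def load(dim, facts, warehouse):
    warehouse["dim_customer"] = dim            # Load: write into the target schema
    warehouse["fact_orders"] = facts
    return warehouse

warehouse = load(*transform(extract(source_rows)), {})
```

<p>Note that the transform step decided the schema up front; any source attribute it dropped is gone for good, which is precisely the limitation discussed next.</p>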



<h3 class="wp-block-heading">Data Warehouse Challenges </h3>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="320" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_2.png" alt="data mesh" class="wp-image-27338" title="Data Mesh: Unveiling the Future of Data Management  14" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_2.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_2-300x127.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_2-495x210.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption">Fig. Data Warehouse </figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<p>However, this approach caused some problems. When creating a warehouse, it was necessary to determine in advance the requirements for the analyses that you want to conduct. ETL processes that transformed data<strong> cut out some of the attributes </strong>from sources, so we lost some of the information that could have been obtained. <strong>Time-to-Market</strong> is yet another issue. Firstly, it took a long time from the idea of the analysis to its implementation. And secondly, the data loading process itself often took a relatively long time, so the information was available with a delay. Scaling the solution was also problematic.  </p>



<h2 class="wp-block-heading" id="Data-Lake">Data Lake&nbsp;</h2>



<p>The above problems, the growing scale of data, the cheaper storage space, and the growing popularity of cloud solutions led to the creation of a new type of architecture – the Data Lake.<br><br>In this case, we store the data in a raw, often unstructured form, which we later transform for all target analytical data solutions. Thus, the ETL process turned into an ELT one. This approach means that it is no longer necessary to know the requirements in advance and all attributes from the source data are still available.</p>
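<p>Contrast this with the ETL flow: in ELT the raw payload is landed first, untouched, and transformed only when an analysis needs it. The following sketch is again hypothetical (field names invented for illustration), but it shows why no attribute is lost.</p>

```python
import json

data_lake = {}   # stands in for cheap raw storage

def extract_and_load(name, raw_payload):
    data_lake[name] = raw_payload            # land the data raw: no attributes cut out

def transform_on_demand(name):
    # Transformation happens at analysis time, so a new question can
    # use any attribute that was captured, not just the pre-planned ones.
    return [r for r in json.loads(data_lake[name]) if r["status"] == "paid"]

extract_and_load("orders_raw",
                 '[{"id": 1, "status": "paid", "channel": "web"},'
                 ' {"id": 2, "status": "refunded", "channel": "shop"}]')
paid = transform_on_demand("orders_raw")
```

<p>The <code>channel</code> attribute was never part of any planned analysis, yet it is still available the day someone asks about it; under the ETL flow it might have been cut out at load time.</p>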



<h3 class="wp-block-heading" id="Data-Lake-Challenges">Data Lake Challenges&nbsp;&nbsp;</h3>



<p>However, the Data Lake architecture also has its drawbacks; for example, <strong>the complexity of data management, security problems, data standardization, and data quality issues. </strong>Without effective classification and access management, the data lake often turns into a &#8220;data swamp&#8221;, where we have the data, but nobody knows where to find it, whether or not it is complete and who is responsible for it.</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="320" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_3.png" alt="data mesh" class="wp-image-27340" title="Data Mesh: Unveiling the Future of Data Management  15" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_3.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_3-300x127.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_3-495x210.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption">Fig. Data Lake</figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h2 class="wp-block-heading">Data Lakehouse&nbsp;</h2>



<p>A Data Lakehouse is a data architecture concept that combines the advantages of both the Data Lake and the Data Warehouse. At the same time, this approach allows us to address some of the disadvantages of both of them. Without losing raw data, we try to simplify data management by using data structures such as tables or views. This facilitates data management and improves security with features such as column-level or row-level access controls.</p>



<p>In addition, the Data Lakehouse structures help to <strong>comply with data privacy regulations.</strong> With it, we can also standardize data processes and monitor the quality of data. There are also tools available to obtain <strong>ACID transactions, indexing, or caching of data, </strong>which means that we use a Data Lake as a Data Warehouse, keeping the advantages of both solutions.</p>
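<p>To illustrate the column-level access control mentioned above, here is a deliberately simplified sketch (roles, grants, and column names are all invented): a table-like view over raw lake records that exposes only the columns a role is allowed to see.</p>

```python
raw_records = [
    {"id": 1, "name": "Ada",  "salary": 5000},
    {"id": 2, "name": "Alan", "salary": 6000},
]

# Column grants, as a lakehouse table's security layer might define them.
column_grants = {
    "analyst": {"id", "name"},              # no access to the sensitive column
    "hr_admin": {"id", "name", "salary"},
}

def select(records, role):
    allowed = column_grants[role]
    # Project each record down to the columns granted to this role.
    return [{k: v for k, v in r.items() if k in allowed} for r in records]

analyst_view = select(raw_records, "analyst")
```

<p>The raw data stays intact in the lake; the table abstraction simply refuses to surface <code>salary</code> to an analyst, which is also what makes compliance with data privacy regulations easier to demonstrate.</p>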



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="360" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_4.png" alt="data mesh" class="wp-image-27342" title="Data Mesh: Unveiling the Future of Data Management  16" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_4.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_4-300x143.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_4-495x236.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption">Fig. Data Lakehouse &nbsp;</figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h3 class="wp-block-heading">Data Catalog&nbsp;</h3>



<p>Regardless of the approach, we are still faced with the issue of the <strong>3 Vs (Volume, Variety and Velocity)</strong> as the amount of available data – both in terms of volume and diversity – is growing very quickly.</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="300" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_5.png" alt="data mesh" class="wp-image-27344" title="Data Mesh: Unveiling the Future of Data Management  17" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_5.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_5-300x119.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_5-495x196.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption"><a href="https://www.techtarget.com/whatis/definition/3Vs" target="_blank" rel="noreferrer noopener">https://www.techtarget.com/whatis/definition/3Vs</a></figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<p>If we want to keep control over all this, it becomes necessary to use a <strong>“data catalog”</strong>, a central location where information about the available data, its sources, structures, formats, meaning, and availability is stored. A data catalog allows users to easily search and discover datasets available in the organization. Thanks to the data catalog, users can quickly find the information they need and understand its context. In addition to the documentation role, these tools often also allow you to manage data access.</p>
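<p>In miniature, a data catalog is just a searchable index of dataset metadata. The sketch below is illustrative (the entries, owners, and team names are made up) and shows the two roles described above: discovery via search, and access management via a reader list.</p>

```python
catalog = [
    {"name": "sales.orders", "owner": "Sales", "format": "parquet",
     "description": "All customer orders since 2020",
     "readers": {"Sales", "Finance"}},
    {"name": "hr.employees", "owner": "HR", "format": "delta",
     "description": "Current employee master data",
     "readers": {"HR"}},
]

def search(term):
    # Discovery: find datasets whose name or description matches the term.
    term = term.lower()
    return [d["name"] for d in catalog
            if term in d["name"].lower() or term in d["description"].lower()]

def can_read(dataset_name, team):
    # Access management: the catalog also records who may read what.
    entry = next(d for d in catalog if d["name"] == dataset_name)
    return team in entry["readers"]

hits = search("orders")
```

<p>Real catalog tools add lineage, tags, and quality metrics on top, but the core value is the same: one place to ask &#8220;what data do we have, and may I use it?&#8221;.</p>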



<h2 class="wp-block-heading" id="Why-Was-the-Data-Mesh-Concept-Created?">Revolutionizing Data Management with Data Mesh. Why Was the Data Mesh Concept Created?&nbsp;</h2>



<p>What other problems may we encounter? Many other questions arise, such as:&nbsp;</p>



<ul class="wp-block-list">
<li>Doesn&#8217;t one central data platform resemble a well-known monolithic system?&nbsp;&nbsp;</li>



<li>With the scale growing, does the data platform team have the time to integrate new data, or do they spend most of their time repairing the flows related to changes in existing data sources?</li>



<li>How well do these people know the source data, and can they model it properly?&nbsp;&nbsp;</li>



<li>Who knows this data best?&nbsp;&nbsp;</li>
</ul>



<h3 class="wp-block-heading">Data Mesh Concept&nbsp;</h3>



<p>The answer to the above challenges may be the Data Mesh architecture, an approach proposed by Zhamak Dehghani. Data Mesh assumes the decentralization of data management and treating data as a product. Data Mesh is the answer to the challenges of managing data in large organizations, such as complexity, lack of scalability, and difficulties in ensuring data quality. </p>



<p>It is a similar approach to decomposing a monolith into microservices. Data Mesh is based on a combination of three concepts <strong>(Product Thinking, Platform Thinking, and Domain-Driven Design)</strong> and applies them to analytical data.</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="520" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_6.png" alt="data mesh" class="wp-image-27346" title="Data Mesh: Unveiling the Future of Data Management  18" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_6.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_6-300x206.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_6-495x340.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption">Fig. Data Mesh Architecture&nbsp;</figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h3 class="wp-block-heading">Data Mesh definition&nbsp;</h3>



<p>The concept&#8217;s author defines Data Mesh as</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p> &#8220;A decentralized sociotechnical approach in managing and accessing data at scale&#8221;.  </p>
</blockquote>



<p>This tells us that it is not only about technical issues, such as scaling or optimizing machines, but also about changing how people in the organization approach data. As a result, we want to be sure that even as the organization&#8217;s complexity grows, we will still be able to quickly obtain value from the data.</p>



<p>It is worth emphasizing that Data Mesh is a concept unrelated to any specific technology and can be implemented in many different ways.&nbsp;</p>



<h2 class="wp-block-heading" id="Data-Mesh-Principles-Explained">Data Mesh Principles Explained&nbsp;</h2>



<ul class="wp-block-list">
<li>Data Domain Ownership – decentralized and distributed responsibility for the data&nbsp;</li>



<li>Data as a Product&nbsp;&nbsp;</li>



<li>Self-Service Data Platform&nbsp;</li>



<li>Federated Data Governance&nbsp;</li>
</ul>



<h3 class="wp-block-heading">Data Domain Ownership, or the decentralization of analytical data </h3>



<p>This principle refers to the decomposition of analytical data into business domains and, more importantly, shifting responsibility for them to domain teams. Consequently, people from a given business area have the responsibility for and ownership of analytical data from their domain, because they understand this data best.  </p>



<p>Some data domains are based on operational data from the systems that produce the data, while others enrich their data with information from other domains. There can also be domains that only combine and transform data from other domains (e.g., by aggregating it).  </p>



<p>Thanks to this decomposition, one central data team is no longer a bottleneck.&nbsp;</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="410" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_7.png" alt="data mesh" class="wp-image-27348" title="Data Mesh: Unveiling the Future of Data Management  19" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_7.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_7-300x163.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_7-495x268.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption">Fig. Division into Domains&nbsp;&nbsp;</figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="510" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_8.png" alt="data mesh" class="wp-image-27350" title="Data Mesh: Unveiling the Future of Data Management  20" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_8.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_8-300x202.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_8-495x334.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption">Fig. Division into Domains&nbsp;&nbsp;</figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h3 class="wp-block-heading">Data as a Product Principle&nbsp;&nbsp;</h3>



<p>In the Data Mesh approach, data is treated as a product that we provide to (share with) others. This product has a product owner who is responsible for the quality, availability, and security of the data. There is demand for the product outside the domain and there are &#8220;customers&#8221; interested in it – that is, recipients of our analytical data.</p>



<p>For example, a CRM domain has all customer information (their profile, change history, segment, etc.) and can share this information as a data product with the rest of the organization. This data can then be used by analysts to create dashboards for management, by marketing specialists to prepare campaigns, or in controlling reports.</p>



<h4 class="wp-block-heading">Domain Data Team&nbsp;</h4>



<p>The domain team addresses the needs of other domains by providing high-quality, well-documented, and reliable data. Information on products throughout the organization should be easily accessible so that anyone who needs it can find it quickly. &nbsp;</p>



<h4 class="wp-block-heading">Contract&nbsp;</h4>



<p>Product data is made available through one or more<strong> output ports</strong> that implement the<a href="https://datacontract.com/" target="_blank" rel="noreferrer noopener"><strong> data contract</strong></a>.&nbsp;&nbsp;</p>



<p>The data contract is a document defining the structure, format, semantics, quality, and terms of use of data agreed between a data provider (a given domain) and its &#8220;customers&#8221; (other domains). &nbsp;</p>



<p>The contract is a communication tool that enables a common understanding of the structure and interpretation of data. Thanks to their structure, contracts can also serve as a basis for code generation, testing, schema validation, quality control, monitoring and access control.&nbsp;</p>
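The idea of a machine-readable contract can be made concrete with a small sketch. The product name, fields, and rules below are invented for illustration and do not follow any particular contract specification:

```python
# Illustrative sketch: a data contract expressed as a machine-readable schema,
# plus a validator that a consuming domain could run on incoming records.
# The product name, field names, and rules are hypothetical.

CUSTOMER_CONTRACT = {
    "product": "crm.customers",
    "version": "1.0.0",
    "fields": {
        "customer_id": {"type": str, "required": True},
        "segment":     {"type": str, "required": True},
        "created_at":  {"type": str, "required": True},  # ISO 8601 date
        "email":       {"type": str, "required": False},
    },
}

def validate(record: dict, contract: dict) -> list[str]:
    """Return a list of contract violations for one record (empty = valid)."""
    errors = []
    for name, rules in contract["fields"].items():
        if record.get(name) is None:
            if rules["required"]:
                errors.append(f"missing required field: {name}")
        elif not isinstance(record[name], rules["type"]):
            errors.append(f"wrong type for {name}")
    return errors

good = {"customer_id": "C-1", "segment": "retail", "created_at": "2024-04-25"}
bad  = {"customer_id": "C-2", "created_at": "2024-04-25"}
print(validate(good, CUSTOMER_CONTRACT))  # []
print(validate(bad, CUSTOMER_CONTRACT))   # ['missing required field: segment']
```

A check like this is exactly the kind of thing contracts enable to be automated: the same document drives schema validation, testing, and monitoring.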



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="500" src="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_9.png" alt="data mesh" class="wp-image-27352" title="Data Mesh: Unveiling the Future of Data Management  21" srcset="https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_9.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_9-300x198.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/04/nearshore_2024.04.25_graphic_9-495x327.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /><figcaption class="wp-element-caption">Fig. Data Product&nbsp;</figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h3 class="wp-block-heading">Self-Service Data Platform&nbsp;</h3>



<p>Domain teams need infrastructure and technical tools to create and share analytical data. That is why we need a shared data platform that can be used by all domains while respecting their autonomy. It is known as a self-service platform because domain teams should be able to use it on their own, to avoid creating another bottleneck.</p>



<p>Without the self-service data platform principle, we could find ourselves in a situation in which each domain builds its own data platform.</p>



<p>As a result, we should have a dedicated data platform team that provides domain-independent features, tools, and systems that enable all domains to create, run, and maintain interoperable data products.</p>



<p>Read also: <a href="https://nearshore-it.eu/articles/technologies/distributed-systems-dapr/">Platform-independent distributed systems — key components of Dapr.io</a></p>



<h4 class="wp-block-heading">Data platform&nbsp;</h4>



<p>The data platform should make it possible to:</p>



<ul class="wp-block-list">
<li>Build an analytical data model </li>



<li>Acquire and store data </li>



<li>Query the data (preferably combining data from different products) and create visualizations </li>



<li>Monitor data quality </li>



<li>Control access </li>



<li>Maintain common standards, policies or regulations (often automating these) </li>



<li>Make product information public so that other teams can discover it </li>



<li>Document product access and usage across domains </li>
</ul>
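As an illustration of the quality-monitoring capability above, the platform might run automated checks such as a completeness metric against every data product. A minimal sketch (the metric, sample data, and threshold are assumptions, not any specific tool's API):

```python
# Illustrative sketch of an automated data-quality check a self-service
# platform could run for each data product; the 0.5 threshold is an assumption.

def completeness(rows: list[dict], column: str) -> float:
    """Share of rows where `column` is present and non-null."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(column) is not None)
    return filled / len(rows)

rows = [
    {"customer_id": "C-1", "segment": "retail"},
    {"customer_id": "C-2", "segment": None},
    {"customer_id": "C-3", "segment": "b2b"},
]

score = completeness(rows, "segment")
print(f"segment completeness: {score:.2f}")  # segment completeness: 0.67
assert score >= 0.5, "quality gate failed: segment completeness below threshold"
```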



<p>It can be very useful to create templates that can be applied by new teams creating their &#8220;data products&#8221;. This will save time and money. &nbsp;</p>



<h3 class="wp-block-heading">Federated Governance&nbsp;&nbsp;</h3>



<p>This principle is about establishing common standards, policies and rules (regarding e.g. modelling, data quality, security, and documentation) that define how domain teams are to create and share their products. </p>



<p>Often such standards are created by a team consisting of representatives of the different domains. <strong>Unified interoperability rules</strong> that allow other domain teams to use data products are key. This means determining: </p>



<ul class="wp-block-list">
<li>a uniform way of accessing data&nbsp;</li>



<li>data exchange formats&nbsp;</li>



<li>global identifiers to combine data from different domains </li>



<li>the form of documentation describing the products </li>
</ul>
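To see why global identifiers matter, consider two hypothetical data products that share a `customer_id`. With a common key, a consuming team can combine them without any bespoke mapping (the domains, fields, and values below are invented for illustration):

```python
# Illustrative sketch: a shared global identifier (`customer_id`) lets one
# domain's data product be joined with another's. All data is made up.

crm_product = [  # published by the CRM domain
    {"customer_id": "C-1", "segment": "retail"},
    {"customer_id": "C-2", "segment": "b2b"},
]
orders_product = [  # published by the Sales domain
    {"customer_id": "C-1", "order_total": 120.0},
    {"customer_id": "C-1", "order_total": 80.0},
    {"customer_id": "C-2", "order_total": 500.0},
]

# Aggregate order totals per global identifier, then enrich with the segment.
totals: dict[str, float] = {}
for order in orders_product:
    cid = order["customer_id"]
    totals[cid] = totals.get(cid, 0.0) + order["order_total"]

report = [
    {"customer_id": c["customer_id"], "segment": c["segment"],
     "total": totals.get(c["customer_id"], 0.0)}
    for c in crm_product
]
print(report)
# [{'customer_id': 'C-1', 'segment': 'retail', 'total': 200.0},
#  {'customer_id': 'C-2', 'segment': 'b2b', 'total': 500.0}]
```

Without an agreed identifier, each consuming team would have to maintain its own mapping between domains, which is exactly the kind of duplication federated governance is meant to prevent.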



<p>It is also important to establish a common approach to enforcing the agreed rules and policies (e.g., regarding the use of personal data or data retention).</p>



<p>A very important role of the data platform is to automate the implementation of the above arrangements as much as possible, so that the platform itself enforces their application.</p>



<p>In conclusion, <strong>the most important principle is the division into domains</strong>. It allows you to parallelize work and maintain economies of scale despite the organization&#8217;s growing complexity.</p>



<p>Wherever we distribute ownership and responsibility, there is a risk of chaos or anarchy, which applying the other three principles helps avoid. Thanks to them, domains can share data and cooperate instead of becoming isolated.</p>



<h2 class="wp-block-heading" id="Data-Mesh-benefits-">Data Mesh benefits&nbsp;</h2>



<ul class="wp-block-list">
<li><strong>Scalability and flexibility</strong> – teams can independently manage the data they know best, which allows them to adapt faster to changing business needs.&nbsp;</li>



<li><strong>Transferring responsibility for analytical data to their &#8220;producers&#8221;</strong> – no more searching for where the data comes from and which flow caused the problem.&nbsp;</li>



<li><strong>Improvement of data quality</strong> – Data Mesh introduces data management standards and procedures, which translates into more accurate information for business units.&nbsp;</li>



<li><strong>Facilitating cooperation and data exchange </strong>– despite data decentralization, Data Mesh allows you to maintain data consistency and integration by using data standards, APIs and services for communication between data teams.&nbsp;&nbsp;</li>



<li><strong>Greater space for innovation and experimentation</strong> – data decentralization in Data Mesh allows different teams to experiment with data and create innovative solutions without having to rely on central data resources. This promotes creativity and allows you to come up with new ideas faster.  </li>



<li><strong>Facilitated data discovery</strong> – this is made possible by publishing data contracts and sharing a data catalog that contains information about the available datasets in the organization. </li>



<li><strong>Better understanding of data connections </strong>– this is made possible thanks to breaking down a large, complex data model into many that are smaller and easier to manage.</li>
</ul>



<h2 class="wp-block-heading" id="Data-Mesh-challenges-">Data Mesh challenges&nbsp;&nbsp;</h2>



<ul class="wp-block-list">
<li><strong>Complexity of</strong> <strong>implementation </strong>– Data Mesh implementation can be complex and requires significant work in terms of transforming the organizational culture, adapting processes and changing infrastructure. </li>



<li><strong>Performance issues </strong>– in the case of combining data from many different domains.&nbsp;</li>



<li><strong>Duplication of data</strong> – it can cause problems with finding a single source of truth.&nbsp;</li>



<li><strong>Lack of a consistent approach to technology </strong>–<strong> </strong>individual domains can implement data as a product in different technologies.&nbsp;</li>



<li><strong>Additional responsibilities </strong>– system owners may not want to take on additional responsibilities related to building the analytical part.&nbsp;</li>



<li><strong>Lack of a broader perspective </strong>– individual domains may focus strongly on their data and lose a broader overview of the entire organization&#8217;s data.&nbsp;&nbsp;</li>



<li><strong>Duplication of solutions</strong> – deficiencies related to the data catalog or documentation may lead to creating similar solutions in different areas.&nbsp;&nbsp;</li>



<li><strong>Risk of chaos </strong>– shifting responsibility for analytical data to source systems without technological support may cause more chaos.&nbsp;&nbsp;</li>



<li><strong>The need to introduce all principles and organizational changes</strong> –<strong> </strong>investing only in data platform infrastructure or buying out-of-the-box tools supporting Data Mesh without organizational changes will not bring the expected results. Without shared global rules and standards, each domain will work in whatever way is most convenient for it, which is not necessarily good from the point of view of the entire organization. Non-compliance with the Data Mesh rules will simply result in areas without a data product approach and its advantages.</li>
</ul>



<h2 class="wp-block-heading" id="Data-Mesh-–-is-this-for-me?">Data Mesh – is it for me? </h2>



<p>You might be looking at the benefits of Data Mesh with enthusiasm. On the other hand, the list of potential issues and challenges is also long. However, many of these result from not implementing all the principles, or implementing them insufficiently – and these principles are the pillars of Data Mesh. The path to implementing Data Mesh is neither short nor easy, but if your organization faces the challenges that led to the Data Mesh concept, it is worth considering.<br> <br>It may be helpful to ask yourself the following questions: </p>



<ul class="wp-block-list">
<li>Is my organization large and complex, with many different data sources?&nbsp;</li>



<li>Is my organization already Domain-Oriented?&nbsp;</li>



<li>Do we use modern data solutions, CI/CD, <a href="https://nearshore-it.eu/articles/project-management-leadership/software-development-life-cycle-devops/" data-type="post" data-id="10050">DevOps</a>, and data in the cloud? </li>



<li>Do we use a <a href="https://nearshore-it.eu/articles/data-driven-decision-making/" data-type="post" data-id="27200">Data-Driven strategy</a> (ML and advanced analytics)? </li>



<li>Do we have management support?&nbsp;</li>



<li>Are we aware that the implementation of Data Mesh is a long-term process?&nbsp;</li>



<li>Do we have the technical capability to build a data platform?&nbsp;</li>
</ul>



<p>If the answer to most of them is <strong>yes,</strong> then the Data Mesh approach will be helpful.  </p>



<h2 class="wp-block-heading" id="Implementing-Data-Mesh-strategy-–-how-to-get-started?">Implementing Data Mesh strategy – how to get started </h2>



<p>It is worth starting the implementation of the Data Mesh approach with small, <strong>specific data use cases</strong>, in teams that are interested and enthusiastic about the topic. If possible, use the existing technology and infrastructure to implement this approach to data. </p>



<p>Small steps will allow you to prove that this approach works and brings value to the organization. With use cases, it is easier to get management&#8217;s support and obtain an investment budget to implement a Data Mesh approach in other systems. It may also be helpful to create a team of data enablers that will support new domain teams by sharing examples, templates, or best practices.</p>



<figure class="wp-block-table"><table><tbody><tr><td><div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2023/09/2020.12.09_jpro_cover-2.jpg" alt="2020.12.09 jpro cover 2" title="Data Mesh: Unveiling the Future of Data Management  22"></div><div class="tile-content"><p class="entry-title client-name">MODERN DATA SOLUTIONS</p>

<h3>Increase efficiency of your business </h3>
Use data to improve efficiency and launch new opportunities in all areas of your company with Inetum DATA & ANALYTICS SERVICES!
<a class="btn btn-primary" href="https://nearshore-it.eu/modern-data-solutions/" target="_blank" rel="noopener">Get started now!</a>



</div></div></div></div></td></tr></tbody></table></figure>



<h2 class="wp-block-heading" id="Data-Mesh-Implementation-–-summary">Data Mesh implementation – summary </h2>



<p>We all know the potential of data. With data, you discover new information, make better business decisions, understand trends, predict behaviors and create new products. Data is one of the most valuable resources in the world today, provided we can arrange it properly to extract information and value quickly and easily from it. For this, it is necessary to choose the right architecture that will allow you to effectively manage data, scale, optimize resources, and respond to business needs. </p>



<p>The Data Mesh architecture stands out among the available options and deserves the attention of any large, domain-oriented organization dealing with Big Data from various sources.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/articles/unveiling-data-mesh-architecture-principles/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Qlik + Talend – the future of data integration</title>
		<link>https://nearshore-it.eu/articles/qlik-talend-the-future-of-data/</link>
					<comments>https://nearshore-it.eu/articles/qlik-talend-the-future-of-data/#respond</comments>
		
		<dc:creator><![CDATA[Katarzyna Warchol]]></dc:creator>
		<pubDate>Thu, 14 Mar 2024 05:27:14 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Technologies]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=27149</guid>

					<description><![CDATA[Data integration is a key element of understanding data. Discover new capabilities offered by Qlik Talend and advanced tools and solutions for seamlessly combining, transforming, and using data. ]]></description>
										<content:encoded><![CDATA[
<p>In a dynamic technological environment, data has become a key resource for organizations. Not only collecting a large amount of data, but also understanding it properly, is an important challenge. Data integration is a key element here, a humble hero that combines various data sources into a consistent whole. Qlik Talend introduces new capabilities by offering advanced tools and solutions for seamlessly combining, transforming and using data. Read the article to find out what Qlik and Talend can offer companies that hope to be data-driven pioneers.</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#What-is-data-integration?">1.  What is data integration?</a></li>
                    <li><a href="#Why-is-data-integration-so-important?">2.  Why is data integration so important?</a></li>
                    <li><a href="#What-is-the-Talend-tool?">3.  What is the Talend tool?</a></li>
                    <li><a href="#Qlik-and-Talend-join-forces">4.  Qlik and Talend join forces</a></li>
                    <li><a href="#Talend-(Qlik)-in-the-Gartner-ranking">5.  Talend (Qlik) in the Gartner ranking</a></li>
                    <li><a href="#Where-is-the-Talend-strategy-heading?">6.  Where is the Talend strategy heading?</a></li>
                    <li><a href="#Summary">7.  Summary</a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="What-is-data-integration?">What is data integration?</h2>



<p>Data Integration is the process of merging information from different sources to create a coherent dataset that allows users to easily access a variety of information and meets the needs of different applications and business processes. This is an important element of data governance, especially in the context of the growing need to integrate large data sets such as data warehouses, data lakes or data lakehouses.</p>



<p>This process is known as <strong>ETL </strong>and involves several steps, such as:</p>



<ul class="wp-block-list">
<li><strong>Extract</strong> – data extraction: retrieving data from various sources such as databases, files, applications, web services, etc.</li>



<li><strong>Transform</strong> – data transformation: converting and standardizing data into one format and structure to ensure consistency and compatibility. This may include cleaning, filtering, aggregation, and data enrichment.</li>



<li><strong>Load</strong> – data loading: transferring processed data to the target system, e.g., a data warehouse, data lake, or analytical database, where it is stored and made available for analysis and reporting.</li>
</ul>
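The three steps can be sketched in a few lines of Python; the source records, cleaning rules, and in-memory "warehouse" below are stand-ins for real systems:

```python
# Minimal ETL sketch following the three steps above. The source data,
# transformation rules, and list-based target are invented for illustration.

def extract() -> list[dict]:
    """Extract: pull raw records from a source (an in-memory stand-in here)."""
    return [
        {"name": " Alice ", "amount": "120.50", "currency": "eur"},
        {"name": "Bob",     "amount": "80",     "currency": "EUR"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    """Transform: clean and standardize records into one format."""
    return [
        {"name": r["name"].strip(),          # cleaning
         "amount": float(r["amount"]),       # type standardization
         "currency": r["currency"].upper()}  # format standardization
        for r in rows
    ]

def load(rows: list[dict], target: list) -> None:
    """Load: write processed records to the target store (a list here)."""
    target.extend(rows)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
print(warehouse)
# [{'name': 'Alice', 'amount': 120.5, 'currency': 'EUR'},
#  {'name': 'Bob', 'amount': 80.0, 'currency': 'EUR'}]
```

Tools like Talend generate and orchestrate pipelines of this shape at scale, replacing the in-memory stand-ins with connectors to real databases, files, and services.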



<p>Data Integration experts design tools and platforms to facilitate the automation of this process, enabling data to be effectively combined and redirected from different source systems to the target ones.</p>


<div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2024/06/BigCTA_MarekCzachorowski.jpg" alt="BigCTA MarekCzachorowski" title="Qlik + Talend – the future of data integration 23"></div><div class="tile-content"><p class="entry-title client-name promotion-box__headline2">Elevate Your Data Strategy</p>
<p class="promotion-box__description2">Our customized Data solutions align with your business objectives. Consult with <strong>Marek Czachorowski</strong>, Head of Data and AI Solutions, for expert guidance.</p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Schedule a meeting</a></div></div></div></div>



<h2 class="wp-block-heading" id="Why-is-data-integration-so-important?"><a></a>Why is data integration so important?</h2>



<p>Organizations often struggle with huge and complex sets of data from a variety of unrelated sources – advertising platforms, CRM systems, marketing automation tools, web analytics, financial systems, partner data, and even real-time data. Without data integration, analysts and developers would have to spend many hours preparing data for each report, and combining all this information into a final result would be impossible. Data integration is crucial for companies that want to understand their data and make informed decisions on that basis. It allows them to improve <a href="https://nearshore-it.eu/articles/what-is-data-quality/">data quality</a>, increase operational efficiency, and support innovation.</p>



<p>With integrated data, organizations can better understand their customers, identify trends, optimize processes, and respond quickly to changing market conditions.</p>



<p>Ultimately, data integration breaks down data silos and enables analysis and operations based on a single, trustworthy, centralized data source that you can rely on.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="1154" height="558" src="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_1.png" alt="nearshore 2024.03.15 graphic 1" class="wp-image-27158" title="Qlik + Talend – the future of data integration 24" srcset="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_1.png 1154w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_1-300x145.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_1-768x371.png 768w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_1-495x239.png 495w" sizes="auto, (max-width: 1154px) 100vw, 1154px" /><figcaption class="wp-element-caption"><a href="https://res.cloudinary.com/talend/image/upload/w_1408/q_auto/qlik/Product%20(small)/Talend%20Data%20Preparation/Talend-data-preparation_740x550_2x_ffjnsi.jpg" target="_blank" rel="noreferrer noopener">https://res.cloudinary.com/talend/image/upload/w_1408/q_auto/qlik/Product%20(small)/Talend%20Data%20Preparation/Talend-data-preparation_740x550_2x_ffjnsi.jpg</a></figcaption></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="1202" height="693" src="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_2.png" alt="nearshore 2024.03.15 graphic 2" class="wp-image-27156" title="Qlik + Talend – the future of data integration 25" srcset="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_2.png 1202w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_2-300x173.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_2-768x443.png 768w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_2-495x285.png 495w" sizes="auto, (max-width: 1202px) 100vw, 1202px" /><figcaption class="wp-element-caption"><a href="https://www.g2.com/products/talend-cloud-data-integration/reviews" target="_blank" rel="noopener">https://www.g2.com/products/talend-cloud-data-integration/reviews</a></figcaption></figure>
</div>




<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="1168" height="716" src="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_3.png" alt="nearshore 2024.03.15 graphic 3" class="wp-image-27154" title="Qlik + Talend – the future of data integration 26" srcset="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_3.png 1168w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_3-300x184.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_3-768x471.png 768w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_3-495x303.png 495w" sizes="auto, (max-width: 1168px) 100vw, 1168px" /></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="996" height="562" src="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_4.png" alt="nearshore 2024.03.15 graphic 4" class="wp-image-27152" title="Qlik + Talend – the future of data integration 27" srcset="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_4.png 996w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_4-300x169.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_4-768x433.png 768w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_4-495x279.png 495w" sizes="auto, (max-width: 996px) 100vw, 996px" /><figcaption class="wp-element-caption"><a href="https://dbmstools.com/tools/talend-data-fabric/" target="_blank" rel="noopener">https://dbmstools.com/tools/talend-data-fabric</a></figcaption></figure>
</div>


<h2 class="wp-block-heading" id="What-is-the-Talend-tool?"><a></a>What is the Talend tool?</h2>



<p>Now that I have explained why data integration is so important, it is time to move on to specific solutions that make this process easier. One of them is the Talend platform, whose manufacturer specializes in data integration and management software.</p>



<p>Talend offers a unique end-to-end platform that combines advanced capabilities to integrate data from different systems and applications and clean it up, thus providing the ability to manage it in different cloud, hybrid, or multi-cloud environments.</p>


<div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2023/02/FotoInetum_abstract18.jpg" alt="FotoInetum abstract18" title="Qlik + Talend – the future of data integration 28"></div><div class="tile-content"><p class="entry-title client-name">DATA ANALYTICS SERVICES</p>

<h3>Use data to your advantage</h3>
Discover our Data Management End-to-End offer!
<a class="btn btn-primary" href="https://nearshore-it.eu/modern-data-solutions/" target="_blank" rel="noopener">Find out more!</a></div></div></div></div>



<h2 class="wp-block-heading" id="Qlik-and-Talend-join-forces"><a></a>Qlik and Talend join forces</h2>



<p>Qlik announced the acquisition of Talend on May 16, 2023, expanding the opportunities for modern enterprises to access, transform, and analyze data. Qlik&#8217;s high-priority strategic project has been finalized, so Qlik can now deliver best-in-class integration, data quality, and analytics solutions.</p>



<p>“Qlik, together with Talend, will bring significant benefits to customers, including expanded product offerings, enhanced support and services, and increased investments in innovation and R&amp;D,” said Mike Capone, CEO of Qlik.</p>



<p><em>“Qlik&#8217;s broad expertise in data integration, analytics, AI and machine learning combined with Talend&#8217;s data integration and data quality solutions, will provide customers the most comprehensive solution in the industry”.</em></p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>Expert insight</p>



<p>Szymon Serwin, Qlik Technical Leader:</p>



<p></p>
<cite>“Qlik is famous for the fact that if they decide on any form of cooperation, partnership or merger of companies, they do it with the potential for its customers in mind. In this case, Talend brings capabilities to work with data properly prepared &#8220;at a lower level&#8221; than the data visualization layer represented by the flagship product, which is Qlik Sense. This allows you to correctly prepare the data and thus relieve Qlik itself from tasks related to the ETL layer and shift attention to the interaction of the tool directly with the end user.<br><br>This is especially important in the context of the increasingly popular Qlik Cloud, which has some limitations when it comes to the volumes of individual applications. Therefore, Talend will allow Qlik to better serve not only huge corporate environments, where the multitude of data sources and their volume require mandatory integration and appropriate preparation, but also smaller environments based 100% on Qlik Cloud. In my opinion, this transaction is a hit, which will allow for even greater complementarity of Qlik&#8217;s offer&#8221;.</cite></blockquote>



<h2 class="wp-block-heading" id="Talend-(Qlik)-in-the-Gartner-ranking"><a></a>Talend (Qlik) in the Gartner ranking</h2>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="660" src="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_5.png" alt="nearshore 2024.03.15 graphic 5" class="wp-image-27150" title="Qlik + Talend – the future of data integration 29" srcset="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_5.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_5-300x262.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.15_graphic_5-452x395.png 452w" sizes="auto, (max-width: 756px) 100vw, 756px" /></figure>
</div>


<p>After the merger, Qlik + Talend holds a leading position in many market categories. For seven years in a row, Talend was among the leaders in Gartner&#8217;s Magic Quadrant for data integration tools, and for five consecutive years it was among the leaders in data quality solutions.<br><br>Qlik, in turn, has been a leader in the Magic Quadrant for Analytics and Business Intelligence Platforms for 13 years in a row. In addition, Qlik was recognized as a leader in the IDC MarketScape ranking for U.S. Business Intelligence and Data Analytics Platforms in 2022.</p>



<p>According to Gartner, Talend offers a rich set of data integration capabilities, including <strong>CDC (Change Data Capture)</strong>, which tracks changes in data sources and helps organizations keep information consistent across systems. Thanks to its cloud capabilities, it is popular among companies with hybrid or multi-cloud environments. Its technology platform supports various roles within the organization and also provides data cataloging.</p>



<p>Talend Data Fabric uses metadata analysis to provide developers with easy access to key data quality information. Thanks to the &#8220;trust score&#8221; function, users receive a clear assessment of the importance, completeness, discoverability, and popularity of various data sets in the Talend Data Inventory module.</p>
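<p>As an illustration only (Talend&#8217;s actual trust score formula is its own and not public), such a composite rating can be sketched as a weighted blend of per-dimension scores; every weight and value below is invented:</p>

```python
# Illustrative weights only -- not Talend's real formula.
WEIGHTS = {
    "importance": 0.3,
    "completeness": 0.3,
    "discoverability": 0.2,
    "popularity": 0.2,
}

def trust_score(metrics):
    """Combine per-dimension scores (each 0..1) into a single 0..5 rating."""
    combined = sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)
    return round(combined * 5, 2)

# A hypothetical data set: strong importance/completeness, weaker popularity.
dataset = {"importance": 0.9, "completeness": 0.8,
           "discoverability": 0.7, "popularity": 0.5}
score = trust_score(dataset)  # -> 3.75
```

<p>The value of such a score is less in the arithmetic than in giving analysts one comparable number per data set before they decide which source to trust.</p>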



<p>Additionally, various products within the Talend platform offer quality-related features, such as semantic-based data quality assessment, to make it easier for users to work with data at every stage of the process.</p>



<p>Talend is constantly developing its CDC capabilities, focusing on providing solutions for complex topologies, enabling real-time data sharing between enterprises and database synchronization.</p>
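<p>Even a simple polling approach conveys the idea behind CDC: compare a source against a last-sync watermark and propagate only what changed. Log-based CDC, which Talend focuses on, avoids the polling overhead but follows the same contract. All rows and field names below are illustrative:</p>

```python
from datetime import datetime, timedelta

# Hypothetical source table: each row carries an updated_at timestamp.
now = datetime(2024, 3, 15, 12, 0)
source_rows = [
    {"id": 1, "name": "Alice", "updated_at": now - timedelta(hours=5)},
    {"id": 2, "name": "Bob",   "updated_at": now - timedelta(minutes=10)},
    {"id": 3, "name": "Carol", "updated_at": now - timedelta(minutes=2)},
]

def capture_changes(rows, since):
    """Return only the rows modified after the last sync point."""
    return [r for r in rows if r["updated_at"] > since]

last_sync = now - timedelta(hours=1)
changed = capture_changes(source_rows, last_sync)
# Only rows 2 and 3 changed within the last hour, so only they
# need to be shipped to the target system.
```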



<p><strong>Read also:</strong> <a href="https://nearshore-it.eu/articles/qlik-sense-saas-cloud-analytics/">Qlik Sense SaaS. Cloud Analytics for enterprises</a></p>



<h2 class="wp-block-heading" id="Where-is-the-Talend-strategy-heading?">Where is the Talend strategy heading?</h2>



<p>The Qlik Talend roadmap covers several key areas that will shape the product&#8217;s development.</p>



<ul class="wp-block-list">
<li><strong>Expand data integration capabilities</strong> – the main goal is to continuously expand data integration capabilities to provide customers with comprehensive tools to combine data from different sources. At the same time, there is a focus on improving data quality to ensure accuracy, consistency, and readiness for analysis.</li>



<li><strong>Working in the cloud </strong>– the development of cloud capabilities is also an important element of the strategy, which will allow customers to remain flexible and scalable in data management. One of the key partners is Microsoft, where Talend offers hundreds of connectors and components for data storage and management.</li>



<li><strong>Integration with business tools </strong>– it is no less important to facilitate integration with analytical tools so that customers can gain insight into valuable data. Currently, Talend provides connections to applications such as Salesforce, Workday, Dynamics 365, Google Analytics, and others.</li>



<li><strong>Research &amp; Development</strong> – another strategic goal is to continuously invest in research and development to introduce innovative solutions that will address the growing needs of customers and changing technological trends.</li>
</ul>



<h2 class="wp-block-heading" id="Summary">Summary</h2>



<p>Many products are currently available on the data analysis tools market, but none of them alone can fully respond to growing data analysis needs.</p>



<p>In its vision, Qlik focuses on innovation, continuous development, facilitating work in the cloud, and better integration of tools. The acquisition of Talend enriches Qlik&#8217;s offer with solutions for data integration and data quality in the cloud, strengthening the company&#8217;s position as a leader in these areas.</p>



<p>With the combination of Qlik and Talend, customers can eliminate technical costs while ensuring that their data is available at key decision-making moments.</p>


<div class="promotion-box promotion-box--image-left promotion-box--full-width-without-image"><div class="tiles latest-news-once"><div class="tile"><div class="tile-content"><p class="promotion-box__description2"><strong>Consult your project directly with a specialist</strong></p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Book a meeting</a></div></div></div></div>
]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/articles/qlik-talend-the-future-of-data/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>What is Data Quality? Data Management Best Practices &#038; More</title>
		<link>https://nearshore-it.eu/articles/what-is-data-quality/</link>
					<comments>https://nearshore-it.eu/articles/what-is-data-quality/#respond</comments>
		
		<dc:creator><![CDATA[-- Do not show the author --]]></dc:creator>
		<pubDate>Thu, 07 Mar 2024 11:23:39 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Best practices]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=27040</guid>

					<description><![CDATA[What is data quality? Read the article and learn why it's so important, the benefits of good quality data, and some tips and tricks to help you get started on the path to quality data immediately!]]></description>
										<content:encoded><![CDATA[
<p>With <a href="https://www.domo.com/learn/infographic/data-never-sleeps-5" target="_blank" rel="noopener">2.5 quintillion bytes</a> worth of data generated each day, companies that optimize and leverage data-led insights stay ahead of the competitive curve. While most of us already use systems that store and structure data, the quality of that data simply isn&#8217;t good enough, with many<a href="https://www.montecarlodata.com/blog-data-quality-survey" target="_blank" rel="noopener"> companies reporting</a> lost revenue, slow response times, and poor decision-making as the consequences of messy master data sets.</p>



<p>But, the good news is that data accuracy can easily be improved by implementing various best practice processes, systems, and governance steps. In this article, we look at all things data quality, including why it&#8217;s so important, the benefits of good&nbsp;quality data, and some tips and tricks to help you get started on the path to high-quality data immediately.</p>



<p>Let&#8217;s get started!</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#What-is-Data-Quality-and-Data-Quality-Management?">1.  What is Data Quality and Data Quality Management?</a></li>
                    <li><a href="#Why-is-Data-Quality-and-Data-Integrity-Important?">2.  Why is Data Quality and Data Integrity Important?</a></li>
                    <li><a href="#Most-Common-Data-Quality-Issues">3.  The 5 Most Common Data Quality Issues</a></li>
                    <li><a href="#The-Benefits-of-Good-Data-Quality">4.  The Benefits of Good Data Quality</a></li>
                    <li><a href="#Best-Practices-to-Improve-Data-Quality">5.  The 6 Dimensions of Data Quality &#8211; Best Practices to Improve Data Quality</a></li>
                    <li><a href="#-6-Data-Quality-Tools,-Techniques,-and-Processes">6.  Here&#8217;s How to Get Started with Data Governance &#8211; 6 Data Quality Tools, Techniques, and Processes</a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="What-is-Data-Quality-and-Data-Quality-Management?">What is Data Quality and Data Quality Management?</h2>



<p>Data quality refers to how well a dataset meets criteria for areas such as accuracy, completeness, and validity. As part of a broader data governance strategy, if your data quality is high, it gives your organization the confidence to rely on the data to measure performance, analyze productivity, and make business decisions.</p>



<p>To keep different data sets in good shape, many organizations deploy various data quality management tactics to maintain data standards. While we&#8217;ll look at these in more detail later on, data quality management uses a mix of people, processes, and governance steps to monitor data values and continually optimize data entry techniques for greater and greater quality.</p>



<h2 class="wp-block-heading" id="Why-is-Data-Quality-and-Data-Integrity-Important?">Why is Data Quality and Data Integrity Important?</h2>



<p>As AI-led big data systems, underpinned by data lakes and data warehouses, become increasingly popular for modern businesses, the saying &#8216;rubbish in, rubbish out&#8217; is on the minds of many CTOs and CIOs. Poor quality data drives poor quality outcomes, so effective data quality management is essential to maintain data integrity, stay ahead of the game, and enable your business strategy.&nbsp;</p>



<p>Fail to maintain good data quality standards, and you could find yourself:</p>



<ul class="wp-block-list">
<li><strong>Repeatedly making poor business decisions</strong>, driven by inaccurate enterprise data analysis</li>



<li><strong>Missing opportunities</strong> to drive additional revenue&nbsp;</li>



<li><strong>Unable to respond to changes</strong> in your industry ahead of your competitors</li>



<li><strong>Open to fines and data breaches</strong>, especially if you regularly handle customer data or medical patient data&nbsp;</li>
</ul>



<p>To bring this to life even further, in 2021, consulting firm<a href="https://www.gartner.com/smarterwithgartner/how-to-improve-your-data-quality" target="_blank" rel="noopener"> Gartner</a> found that <strong>bad data quality costs organizations an average of $12.9 million per year.</strong> That&#8217;s a lot of money to lose all because of inaccurate data and poor data quality control across your organization!</p>



<p><strong>Also read:</strong>&nbsp; <a href="https://nearshore-it.eu/articles/technologies/data-mining-methods/">Data mining methods</a>&nbsp;</p>



<h2 class="wp-block-heading" id="Most-Common-Data-Quality-Issues">The 5 Most Common Data Quality Issues</h2>



<p>It&#8217;s easy for us to sit here and recommend you improve your data quality, but no one purposefully sets out to have<strong> poor-quality data</strong> &#8211; unfortunately, high-quality, consistent data is simply hard to achieve.<br><br>But, there are some common reasons businesses struggle with data quality problems. Let&#8217;s take a look at the most common data consistency issues.</p>



<ul class="wp-block-list">
<li>Businesses use a range of different systems that each hold data in different formats and structures, creating inconsistent data standards.</li>



<li>Businesses take data from multiple data sources, each using its own data types, data pipelines, data formats, and business rules.</li>



<li>Businesses run many projects to move between different systems. Regular data migration can erode data quality efforts, making it hard to maintain a trusted data set.</li>



<li>The legislation of particular countries or regions regularly changes, meaning data records constantly need updating, which opens the door to errors and inconsistency.</li>



<li>Staff simply aren&#8217;t educated on maintaining high data quality, leading to the quality standard eroding over time.</li>
</ul>
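<p>The format fragmentation described above is concrete: three systems can hold the same date in three different ways. A minimal sketch of normalizing them into one canonical value (all source names, records, and formats here are invented for illustration):</p>

```python
from datetime import datetime

# Three systems export the same signup date in different formats --
# the classic cross-system inconsistency described above.
raw_records = [
    {"source": "crm",     "signup": "2024-03-07"},
    {"source": "billing", "signup": "07/03/2024"},
    {"source": "legacy",  "signup": "7 Mar 2024"},
]

KNOWN_FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%d %b %Y"]

def normalize_date(value):
    """Try each known source format and emit one canonical ISO date."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {value!r}")

normalized = [normalize_date(r["signup"]) for r in raw_records]
# All three records now agree on "2024-03-07".
```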



<p>Many businesses already use data analytics to drive business decisions, but to maintain data quality, you need to overcome these common challenges, which affect every aspect of your data. If you make a fresh start and manage to get new data in good shape, here are some of the benefits you can uncover.</p>



<h2 class="wp-block-heading" id="The-Benefits-of-Good-Data-Quality">The Benefits of Good Data Quality</h2>



<p>Now that we know the consequences of bad data quality, it&#8217;s time to look on the bright side. Improving the quality of data stores across your business will allow you to level up your business operations by:</p>



<ul class="wp-block-list">
<li><strong>Improving productivity </strong>through automated business rules</li>



<li><strong>Enhancing the service </strong>you can provide to your customers, driving increased revenue opportunities</li>



<li><strong>Reducing costs </strong>thanks to faster decision-making and less duplication</li>



<li><strong>Creating consistency </strong>across processes and systems</li>



<li><strong>Easily demonstrating compliance</strong> thanks to embedded quality assurance frameworks&nbsp;</li>



<li>And many more!</li>
</ul>



<p>Data quality improvement will benefit every corner of your business, so it pays to invest time and effort into a data quality framework that you can implement across the business. To help you begin to address data problems in your business, let&#8217;s look at some data management best practices. </p>



<h2 class="wp-block-heading">Enhancing Strategic Decision-Making with Quality Data</h2>



<p>Understanding the depth of data-driven decision making is crucial for businesses aiming to fully harness the power of high-quality data. It&#8217;s not just about having data but making sure that data is of the quality required to produce reliable, actionable insights that influence critical business decisions. Integrating data quality with data-driven decision strategies ensures that the information used is accurate, timely, and relevant, thereby enhancing the effectiveness of strategic choices. For more on how to effectively apply these practices within your organization, read our in-depth exploration on <a href="https://nearshore-it.eu/articles/data-driven-decision-making/">data-driven decision making.</a></p>



<h2 class="wp-block-heading" id="Best-Practices-to-Improve-Data-Quality">The 6 Data Quality Dimensions &#8211; Best Practices to Improve Data Quality</h2>



<p>Before embarking on your own data quality improvement process, it&#8217;s worth stepping back to look at the fundamentals of a good data governance program.</p>



<p>This begins with the six data quality dimensions, rounded out by a seventh practical test: fitness for purpose. Each views data quality through a different lens, helping you consider all angles so your data truly serves the business. Let&#8217;s take a look at each in turn.&nbsp;</p>



<ul class="wp-block-list">
<li><strong>Completeness: </strong>Completeness measures the proportion of data records that are complete and ready to be used. Partially complete data may lead to unreliable results that are not fully representative of the analysis you&#8217;re trying to complete.<br></li>



<li><strong>Uniqueness:</strong> Data sets have a habit of duplicating themselves, so uniqueness measures the percentage of duplicate data in your sets. To avoid misrepresentation, tactics such as data cleansing and data quality rules help ensure that records are unique and individual within your sets.<br></li>



<li><strong>Validity:</strong> Validity measures how much of your data meets the required format for a particular business rule. Often, this refers to data being in the right format, pattern, or range to make it applicable to a given situation.<br></li>



<li><strong>Timeliness:</strong> In an ever-digital world, data must be available at the click of a button. Real-time data processing is a demand of many businesses, so when managing data, you have to ensure it can be generated, received, and manipulated at exactly the time it&#8217;s needed.<br></li>



<li><strong>Accuracy:</strong> In a world where many systems use the same sets of data, accuracy refers to how reliable the data is versus the agreed &#8216;source of truth&#8217;.<br></li>



<li><strong>Consistency:&nbsp;</strong>Building on accuracy, while one system often leads the way as the master source, other systems conduct data quality checks against it. This is what consistency is all about: ensuring values agree across systems at an enterprise level, enforced through rules, audits, and data quality assessments.<br></li>



<li><strong>Fitness for purpose:</strong> Finally, data is only effective if it helps the business do its job. Data could be 100% accurate, but if no one needs it, it&#8217;s a waste of time, effort, and money.</li>
</ul>
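<p>Several of these dimensions can be measured mechanically. A minimal sketch, using toy records and an intentionally simplified email pattern, of scoring completeness, uniqueness, and validity:</p>

```python
import re

# Toy customer records: the None email and the duplicate id deliberately
# illustrate completeness, uniqueness, and validity failures.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@example.com"},  # duplicate id
    {"id": 3, "email": "not-an-email"},
]

def completeness(rows, field):
    """Share of rows where the field is filled in."""
    return sum(r[field] is not None for r in rows) / len(rows)

def uniqueness(rows, field):
    """Share of values that are distinct."""
    values = [r[field] for r in rows]
    return len(set(values)) / len(values)

def validity(rows, field, pattern):
    """Share of filled-in values matching the expected pattern."""
    filled = [r[field] for r in rows if r[field] is not None]
    return sum(bool(re.fullmatch(pattern, v)) for v in filled) / len(filled)

report = {
    "completeness": completeness(records, "email"),  # 3 of 4 filled
    "uniqueness":   uniqueness(records, "id"),       # 3 distinct ids of 4
    "validity":     validity(records, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"),
}
```

<p>Timeliness, accuracy, and fitness for purpose resist this kind of mechanical check: they require an agreed source of truth and knowledge of how the business actually uses the data.</p>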



<p>The dimensions of data quality are the underpinning foundations of great data governance. They should be applied no matter the size and shape of your data sets and no matter where the data is located.&nbsp;</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="490" src="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.07_graphic_1.png" alt="data quality" class="wp-image-27041" title="What is Data Quality? Data Management Best Practices &amp; More 30" srcset="https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.07_graphic_1.png 756w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.07_graphic_1-300x194.png 300w, https://nearshore-it.eu/wp-content/uploads/2024/03/nearshore_2024.03.07_graphic_1-495x321.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h2 class="wp-block-heading" id="-6-Data-Quality-Tools,-Techniques,-and-Processes">Here&#8217;s How to Get Started with Data Governance &#8211; 6 Data Quality Tools, Techniques, and Processes</h2>



<p>Now that we know data quality is important for business performance and the dimensions to build trust in data, it&#8217;s time to build out your own data quality assessment framework. This will help you continuously root out inaccurate or inconsistent data and make positive strides to fix it for the benefit of your entire organization.</p>



<p><strong>Let&#8217;s take a look at the 6-step process.</strong></p>



<h3 class="wp-block-heading">#1 &#8211; Hire Dedicated Data Stewards</h3>



<p>If you want to get serious about data quality, it all starts with getting the right people on board who are dedicated to improving the state of data in your organization. Many businesses start by hiring a Data Steward who&#8217;s ultimately responsible for leading the governance approach to data quality and improving the accuracy of data in your organization.</p>



<p>While the best results come from hiring dedicated stewards and data managers, it could also be a side-of-desk activity for data scientists, architects, or information security professionals. This will help you get off to a good start if you don&#8217;t have the budget to take on a new headcount.&nbsp;</p>



<p>Either way, once the roles are assigned, invest in adequate training and awareness of data quality to ensure they have all the knowledge they need to start promoting data quality best practices.</p>



<h3 class="wp-block-heading">#2 &#8211; Agree on a Data Profiling Strategy</h3>



<p>With the right people in place, it&#8217;s time to start building your data profiling strategy. Data profiling is the process of examining, analyzing, and creating a summary of the data you hold. Put simply, this is the process you&#8217;ll go through to assess the quality of your data.&nbsp;</p>



<p>Within your strategy, you&#8217;ll want to define things such as:&nbsp;</p>



<ul class="wp-block-list">
<li>How often you&#8217;ll analyze data held in different locations</li>



<li>The techniques/lenses you&#8217;ll use to examine data</li>



<li>How you&#8217;ll measure the data accuracy</li>



<li>The level of data quality that&#8217;s acceptable for your business</li>



<li>How you&#8217;ll report data quality issues to other departments, such as IT, Risk, and Compliance</li>
</ul>



<p>Once you have a clear view of your strategy, it&#8217;s time to put your data quality measures into action.</p>
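<p>A data profile at its simplest is a per-column summary. The sketch below, over invented rows, shows the shape of output a profiling pass might produce; real profiling tools add distributions, pattern detection, and outlier analysis:</p>

```python
# Minimal column profiling: null count, distinct count, and value range.
# The rows and column names are illustrative.
rows = [
    {"age": 34, "country": "PL"},
    {"age": None, "country": "PL"},
    {"age": 29, "country": "DE"},
]

def profile(rows, column):
    """Summarize one column of a row-oriented data set."""
    values = [r[column] for r in rows]
    filled = [v for v in values if v is not None]
    return {
        "nulls": len(values) - len(filled),
        "distinct": len(set(filled)),
        "min": min(filled),
        "max": max(filled),
    }

age_profile = profile(rows, "age")
```

<p>Running such a summary on a schedule, and alerting when the null count or value range drifts, is the simplest way to turn a one-off assessment into continuous monitoring.</p>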



<h3 class="wp-block-heading">#3 &#8211; Go Hunting for Data Quality Problems</h3>



<p>Now, it&#8217;s time for action as you hunt for data quality problems within your organization. You&#8217;ll need to plan exactly how you&#8217;ll do this, arranging things like data access requests, access to systems and tools, and time with relevant stakeholders to understand how data is used in a particular area.&nbsp;</p>



<p>It&#8217;s best to have a clear data governance process to accompany this. This will help you keep track of what&#8217;s happening when and ensure you have oversight and support if you encounter any problems.</p>



<p>Once you&#8217;ve completed your assessment, you&#8217;ll likely generate an outcome report to share with stakeholders. This will clearly show the level of data quality in that particular area, including an objective pass/fail assessment and recommendations for improvement.&nbsp;</p>



<h3 class="wp-block-heading">#4 &#8211; Build an Expert Data-Fixing Taskforce</h3>



<p>With a list of recommendations, it&#8217;s time to get the right people together to fix the problems. Depending on the size and scope of your data quality team, this may be done yourself or passed onto the business area that owns the data to take action themselves.&nbsp;</p>



<p>The latter aligns with <a href="https://nearshore-it.eu/articles/master-data-management-what-is-mdm/">a master data management</a> approach, whereby business and IT professionals work together to ensure data quality remains high across all applications and data sets. Given that poor data quality can result in various negative consequences, it makes sense for everyone to pull together to ensure the data is fit to serve the business operations.&nbsp;</p>



<h3 class="wp-block-heading">#5 &#8211; Audit the Process with Quality Assurance Activities</h3>



<p>Like all good compliance activities, you need to add a third-party review to ensure your data governance framework is adhered to. Whether you team up with a formal auditing team or set up your own local quality assurance, you have to keep your data stewards, business owners, and application teams accountable for data quality best practices.&nbsp;</p>



<p>Many organizations track a schedule of data quality reviews, ensuring each is completed on time, to a good standard, and that any follow-on actions are completed to a high standard. This helps keep bad data at bay, ensuring that every component of the overall data quality governance model is effective.&nbsp;</p>



<h3 class="wp-block-heading">#6 &#8211; Track It All With a Data Quality Management Tool</h3>



<p>Where many businesses start and fail with data quality is by trying to do it all manually. Especially when dealing with large data sets, a data quality solution or tool can help automate a lot of manual work, increasing the speed and accuracy of your analysis activities.&nbsp;</p>



<p>Popular tools such as <strong>OpenRefine</strong>, <strong>Talend</strong>, and <strong>Cloudingo </strong>are perfect for executing your data profiling strategy, helping you identify, analyze, clean, and re-format data to boost quality and provide confidence to business leaders when making decisions.&nbsp;</p>



<p>While these tools require investment and setup, they&#8217;ll help take your data quality capabilities to the next level as you standardize and assure your organization&#8217;s entire enterprise data set.&nbsp;</p>



<h2 class="wp-block-heading">Businesses succeed and fail based on their data quality</h2>



<p>In a world where data is at the heart of everything we do, if your data quality is low, it can leave you at a real disadvantage. If you want to avoid risk, poor decision-making, and slow performance, you need to invest in your ability to maintain high levels of data quality.&nbsp;</p>



<p>The tips in this guide are a great place to start, but for the best results across all of your data needs, we suggest partnering with a technology expert.<strong> At Inetum, our Smart Data offering helps organizations worldwide define, implement, and achieve their data strategies, including initial visioning, change management, and long-term data governance.&nbsp;</strong></p>



<p>If you&#8217;d like to join the hundreds of customers who benefit from our expertise, knowledge, and partnership, reach out today to understand how we can help level up your enterprise data quality.&nbsp;</p>



<figure class="wp-block-table"><table><tbody><tr><td><div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2023/02/FotoInetum_abstract18.jpg" alt="FotoInetum abstract18" title="What is Data Quality? Data Management Best Practices &amp; More 31"></div><div class="tile-content"><p class="entry-title client-name">DATA ANALYTICS SERVICES</p>

<h3>Use data to your advantage</h3>
Discover our Data Management End-to-End offer!
<a class="btn btn-primary" href="https://nearshore-it.eu/modern-data-solutions/" target="_blank" rel="noopener">Find out more!</a></div></div></div></div></td></tr></tbody></table></figure>
]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/articles/what-is-data-quality/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Big Data in healthcare: management, analysis and future prospects for healthcare organizations</title>
		<link>https://nearshore-it.eu/technologies/big-data-in-healthcare/</link>
					<comments>https://nearshore-it.eu/technologies/big-data-in-healthcare/#respond</comments>
		
		<dc:creator><![CDATA[-- Do not show the author --]]></dc:creator>
		<pubDate>Thu, 30 Nov 2023 10:13:47 +0000</pubDate>
				<category><![CDATA[Technologies]]></category>
		<category><![CDATA[Articles]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=26368</guid>

					<description><![CDATA[Big Data in healthcare holds transformative potential, streamlining operations and advancing patient care. Join the journey into data-driven innovation for a promising future in the medical industry.]]></description>
										<content:encoded><![CDATA[
<p>In the ever-evolving landscape of modern healthcare, the concept of “Big Data” has emerged as a transformative force, promising a revolution in management and analysis, and ultimately resulting in the improved delivery of medical services. Big Data in healthcare is a mosaic of information, including patient data, clinical notes, diagnostic images, treatment histories, and plenty of other healthcare-related data elements.&nbsp;</p>



<p>This vast information base has the potential to streamline operations and drive innovative advances in patient care. In this article, we embark on a journey through health data, delving deeper into data management, data integration, and the analysis of large volumes of data. And, of course, the promising future it brings for the medical industry.</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#What-is-Big-Data-Analytics-in-Healthcare?--">1.  What is Big Data Analytics in Healthcare?  </a></li>
                    <li><a href="#How-to-use-Big-Data-in-Healthcare">2.  How to use Big Data in Healthcare</a></li>
                    <li><a href="#Data-Management:-Big-Data-Applications-in-Healthcare">3.   Data management: Big Data Applications in Healthcare</a></li>
                    <li><a href="#Challenges-of-Big-Data-in-Healthcare">4.  Challenges of Big Data in Healthcare</a></li>
                    <li><a href="#Benefits-of-Big-Data-in-the-Healthcare-Industry">5.  Benefits of Big Data in the Healthcare Industry</a></li>
                    <li><a href="#Future-Prospects-of-Big-Data-in-Healthcare">6.  Future prospects for Big Data in Healthcare</a></li>
                    <li><a href="#summary">7.  To sum up…</a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="What-is-Big-Data-Analytics-in-Healthcare?--">What is Big Data Analytics in Healthcare? &nbsp;</h2>



<p>Big Data analytics refers to the challenging process of analyzing vast data sets to identify hidden patterns, market trends, unknown relationships, and customer preferences, enabling organizations to make better clinical and business decisions. The term Big Data itself describes the vast amounts of information generated by digital technologies that collect data from a variety of sources, such as electronic health records (EHRs), test results, diagnoses, medical images, data from smart wearables, and healthcare-related demographic and financial data. &nbsp;</p>



<h2 class="wp-block-heading" id="Clinical-data-management-">Clinical data management&nbsp;</h2>



<p>For many years, collecting large amounts of data for medical purposes was expensive and time-consuming. Constantly developing technology now makes it easier to gather data, create comprehensive medical reports, and transform them into valuable conclusions, often saving human lives. This is the purpose of data analytics in healthcare: to use data-driven results to predict and solve problems before it&#8217;s too late (predictive analytics), but also to evaluate methods and therapies faster, track changes better, engage patients more in their own healthcare, and equip them with all the tools they need to do so.&nbsp;</p>


<div class="promotion-box promotion-box--image-left promotion-box--full-width-without-image"><div class="tiles latest-news-once"><div class="tile"><div class="tile-content"><p class="promotion-box__description2"><strong>Consult your project directly with a specialist</strong></p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Book a meeting</a></div></div></div></div>



<h2 class="wp-block-heading" id="How-to-use-Big-Data-in-Healthcare">How to use Big Data in Healthcare&nbsp;</h2>



<p>The number of sources from which healthcare professionals can derive knowledge about their patients is constantly growing. This data usually comes in different formats and sizes, which is challenging for the user. Nowadays, however, the focus is not on how “big” this data is, but on how to use it wisely. With the right technologies, data can be quickly and cleverly obtained from sources such as:&nbsp;</p>



<ul class="wp-block-list">
<li><strong>Patient portals </strong></li>



<li><strong>Scientific research </strong></li>



<li><strong>Payment details </strong></li>



<li><strong>Databases for general purposes </strong></li>



<li><strong>Electronic health records (EHR) </strong></li>



<li><strong>Wearable devices</strong> (e.g. smart watches and bands that monitor health parameters, medical equipment such as patient monitors or blood pressure monitors) </li>



<li><strong>Searches </strong></li>
</ul>



<p><strong>Also read:</strong> <a href="https://nearshore-it.eu/articles/technologies/digital-transformation-in-the-healthcare-industry" target="_blank" rel="noreferrer noopener">Digital transformation in the healthcare industry</a>&nbsp;&nbsp;</p>



<h2 class="wp-block-heading">Examples of Big Data applications in healthcare&nbsp;</h2>



<p>Big Data analytics have numerous capabilities in healthcare, and the list of Big Data uses in healthcare is constantly growing. These solutions can help to improve the quality of care, allow us to optimize processes, and create more effective strategies. The major applications of Big Data in the healthcare industry include:&nbsp;</p>



<ol class="wp-block-list">
<li><strong>Diagnosis and treatment:</strong> Big Data enables more accurate diagnosis of diseases and patients&#8217; treatment plans. Analyzing clinical data, such as test results, medical images, and electronic health record (EHR) data, allows doctors to make faster and more precise clinical decisions. It is also possible to monitor the progress of treatment and adapt the therapy to the individual needs of the patient. </li>



<li><strong>Preventing epidemics:</strong> Big Data makes it possible to monitor and track the spread of infectious diseases. Early case detection and contact tracing can help prevent an epidemic and respond quickly in the event of an outbreak. </li>



<li><strong>Personalized healthcare: </strong>With Big Data, you can create personalized healthcare plans which take the individual patients&#8217; needs into account. This also includes tailoring therapies, diets, and other aspects of care to the specific features of the person being treated. </li>



<li><strong>Optimizing hospital processes:</strong> Hospitals and clinics use Big Data to optimize patient management, the availability of places in hospitals, staff planning, and medicine inventory organization. This helps to improve productivity and reduce costs. </li>



<li><strong>Clinical trials:</strong> Big Data makes it possible to conduct more advanced clinical trials, including the analysis of treatment and drug outcomes on a large number of patients. This speeds up the process of developing new drugs and therapies. </li>



<li><strong>Engaging patients:</strong> By monitoring their health through apps and smart wearables, patients can more actively participate in their own healthcare. This data can then be analyzed to provide patients with tailored advice and information. </li>



<li><strong>Improving health insurance plans:</strong> Health insurance companies use Big Data to assess risk and create personalized insurance plans. This allows them to deliver better service and maintain control over costs. </li>



<li><strong>Better quality:</strong> Big Data allows you to monitor the quality of healthcare services and identify areas that need improvement. This helps to safeguard the highest quality of patient care. </li>



<li><strong>Predictive analysis and prevention:</strong> Big Data makes it possible to identify patients who may be at risk of serious diseases in the future. This, in turn, allows you to intervene faster and undertake preventive actions. </li>



<li><strong>Scientific research: </strong>Scientific research and analysis of health trends at the population level, which can lead to new discoveries in medicine, is also possible thanks to Big Data. </li>
</ol>
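Predictive analysis of the kind described in the list above can start very simply. The sketch below scores patients with a logistic function over a few health indicators and flags those above a threshold; the feature names, weights, and threshold are illustrative assumptions, not a validated clinical model.

```python
import math

# Hypothetical risk model: feature names and weights are illustrative
# assumptions for this sketch, not clinically validated values.
WEIGHTS = {"age": 0.04, "bmi": 0.08, "systolic_bp": 0.02, "smoker": 0.9}
BIAS = -7.5

def risk_score(patient: dict) -> float:
    """Return a 0..1 probability-like score from a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def flag_high_risk(patients: list[dict], threshold: float = 0.5) -> list[str]:
    """Return IDs of patients whose score meets or exceeds the threshold."""
    return [p["id"] for p in patients if risk_score(p) >= threshold]

patients = [
    {"id": "P-001", "age": 44, "bmi": 23.0, "systolic_bp": 118, "smoker": 0},
    {"id": "P-002", "age": 71, "bmi": 31.5, "systolic_bp": 162, "smoker": 1},
]
print(flag_high_risk(patients))  # → ['P-002']
```

In practice such a model would be trained on historical EHR data and reviewed by clinicians; the point here is only that a risk flag is a simple function of patient features.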



<h2 class="wp-block-heading">Big Data in biomedical research&nbsp;</h2>



<p>Big Data in biomedical research is a key factor expediting the discovery and development of new therapies and diagnostics, as well as a better understanding of biological processes. Genome sequencing, gene expression research, and protein analysis are just some examples of the key role Big Data plays. This data allows researchers to identify the genetic roots of diseases, detect diagnostic biomarkers, and develop therapies tailored to individual needs. The combination of Big Data and advanced technologies opens the door to a revolution in healthcare and leads us toward a medicine that is more personalized and effective.&nbsp;</p>



<h2 class="wp-block-heading">Internet of Things (IoT)&nbsp;</h2>



<p>The Internet of Things (IoT) is another significant contributor to the transformation of healthcare through Big Data. In the medical environment, IoT means a combination of a variety of devices and sensors that collect medical data and transmit it to central systems. There, it can be analyzed and used to improve medical care.&nbsp;</p>



<p>With ongoing access to data on glucose levels, blood pressure, physical activity, and other health metrics, carers can respond rapidly to changes and adjust therapies. This not only improves patients’ quality of life but also reduces healthcare costs by preventing complications and visits to the hospital. Combined with Big Data, IoT is a powerful tool for personalizing healthcare and making key clinical decisions.&nbsp;</p>



<h2 class="wp-block-heading" id="Data-Management:-Big-Data-Applications-in-Healthcare">Data Management: Big Data Applications in Healthcare&nbsp;</h2>



<p>Data management plays a key role in harnessing the potential of Big Data in healthcare. In a medical environment that generates massive amounts of data, the effective collection, storage, processing, and analysis of information poses a major challenge. Well-organized data management also improves administrative processes, such as bill settlement and the scheduling of medical staff. As a result, it enables doctors, researchers, and medical institutions to take more informed clinical and strategic measures. &nbsp;</p>



<p>Fortunately, there are a number of tools and systems which can help with collecting, processing, analyzing, and managing huge amounts of medical information. Below you will find some of the most popular tools for this industry:&nbsp;</p>



<ol class="wp-block-list">
<li><strong>Patient Data Management Systems (EHR/EMR):</strong> Examples include Epic, Cerner, or Allscripts. These systems facilitate the collection and sharing of patient information (including medical history, test results, and prescriptions). </li>



<li><strong>Databases:</strong> Database management tools such as Oracle, Microsoft SQL Server, and MySQL are used to store and organize medical data. </li>



<li><strong>Hadoop:</strong> This is a Big Data framework that can be used for Big Data analytics in healthcare. </li>



<li><strong>Spark:</strong> A tool for processing and analyzing Big Data that is especially useful for analyzing medical data. </li>



<li><strong>Tableau:</strong> A system for creating interactive data visualizations. It makes the analysis and presentation of results easier. </li>



<li><strong>SAS:</strong> Data analysis software that is used in clinical trials and results analysis. </li>



<li><strong>Apache Cassandra:</strong> A solution for storing and managing medical data in real time. </li>



<li><strong>REDCap:</strong> A system for creating databases and managing data in clinical trials. </li>



<li><strong>QlikView:</strong> A data visualization Business Intelligence system that can also help you analyze medical data. </li>



<li><strong>MongoDB:</strong> A NoSQL database that can be employed to store and analyze medical data in an unstructured format. </li>
</ol>
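To illustrate why a schema-free store such as MongoDB suits heterogeneous medical records, the sketch below keeps documents with different fields side by side and filters them in plain Python. The field names and sample values are invented for the example; the equivalent pymongo query is shown in a comment.

```python
# Schema-free "documents", as one might store them in a MongoDB collection.
# In pymongo, the equivalent query would be:
#   db.records.find({"labs.hba1c": {"$gt": 6.5}})
records = [
    {"patient_id": "P-001", "labs": {"hba1c": 7.1, "ldl": 130}},
    {"patient_id": "P-002", "imaging": {"modality": "MRI", "region": "knee"}},
    {"patient_id": "P-003", "labs": {"hba1c": 5.4}},
]

def find_hba1c_above(docs: list[dict], threshold: float) -> list[str]:
    """Return patient IDs whose lab document contains hba1c > threshold."""
    return [d["patient_id"] for d in docs
            if d.get("labs", {}).get("hba1c", 0) > threshold]

print(find_hba1c_above(records, 6.5))  # → ['P-001']
```

Note that P-002 has no "labs" field at all: a document store accepts that without schema changes, which is exactly the flexibility unstructured medical data needs.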



<p>The above systems are only a small number of the available solutions. The decision to choose a particular tool depends on the needs of the medical organization and the type of data that is to be collected and analyzed. &nbsp;</p>


<div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2023/02/FotoInetum_abstract18.jpg" alt="FotoInetum abstract18" title="Big Data in healthcare: management, analysis and future prospects for healthcare organizations 32"></div><div class="tile-content"><p class="entry-title client-name">DATA ANALYTICS SERVICES</p>

<h3>Use data to your advantage</h3>
Discover our Data Management End-to-End offer!
<a class="btn btn-primary" href="https://nearshore-it.eu/modern-data-solutions/" target="_blank" rel="noopener">Find out more!</a></div></div></div></div>



<h2 class="wp-block-heading" id="Challenges-of-Big-Data-in-Healthcare">Challenges of Big Data in Healthcare&nbsp;</h2>



<p>Big Data analysis in healthcare brings many benefits, but also challenges, often related to the implementation of Big Data healthcare systems for data collection, storage, and sharing. &nbsp;</p>



<p>The first such challenge is <strong>privacy and data security. </strong>Medical data is extremely sensitive and is subject to strict privacy laws. This data must be kept safe, as any breaches can lead to serious legal consequences and loss of patient trust. With this in mind, we move to another challenge: <strong>compliance.</strong> Healthcare is one of the sectors that are subject to many legal and ethical regulations (such as HIPAA in the United States or the GDPR in the European Union). Complying with these regulations is mandatory and may impose restrictions on data analysis and processing. &nbsp;</p>



<p>But that is not all. <strong>The interoperability of systems and data quality are also big challenges.</strong> Many healthcare providers use various EHR systems and other data management tools. Ensuring compatibility and interoperability between these systems is often hard and requires standardization. Multiple data sources can be subject to errors, inaccuracies, and gaps. Incorrect data can lead to wrong conclusions and clinical decisions. So now you can see that in data-driven healthcare management, precision and confidence in data quality is a must. Errors in the analysis or the incorrect interpretation of data can lead to serious consequences for patients.&nbsp;</p>



<p><strong>Scalability, data management, and analysis are</strong> some of the key challenges here. As the amount of medical data increases, it is necessary to provide adequate infrastructure and resources to store, process, and analyze medical data &#8211; not to mention expert knowledge and skills. In this context, a lack of competences can be an additional challenge.&nbsp;</p>



<p><strong>Costs and return on investment can also be a big challenge.</strong> Implementing Big Data analysis in healthcare can be expensive, and organizations need to carefully assess the potential return on investment. Such transformations also affect business processes, <strong>the organizational culture</strong>, and ways of thinking, all of which must evolve so that medical institutions can fully utilize the potential of the solutions implemented.&nbsp;</p>



<p>Despite these difficulties, the potential for Big Data to transform healthcare is huge. The benefits are recognized by many medical organizations striving to address these issues to deliver better healthcare and medical innovation. &nbsp;</p>



<h2 class="wp-block-heading" id="Benefits-of-Big-Data-in-the-Healthcare-Industry">Benefits of Big Data in the Healthcare Industry &nbsp;</h2>



<p>The implementation and use of Big Data brings undeniable benefits, including:&nbsp;</p>



<ol class="wp-block-list">
<li><strong>Cost reduction:</strong> According to Datavant CEO Pete McCabe, smarter use of data could <a href="https://www.mckinsey.com/industries/healthcare/our-insights/how-can-healthcare-unlock-the-power-of-data-connectivity" target="_blank" rel="noreferrer noopener">eliminate up to 75% of unnecessary healthcare costs</a>. By identifying patterns and trends, Big Data analytics can help healthcare organizations discover areas for improvement. Big Data helps optimize the management of medicine inventories, medical equipment, as well as other resources.  </li>



<li><strong>Improving the quality of care:</strong> through healthcare personalization, better chronic disease management, and early disease detection and prevention. By analyzing clinical outcomes and monitoring quality metrics, Big Data helps identify areas for improvement and raise healthcare standards. It can also help care for patients suffering from chronic diseases, monitor their health, and deliver customized care. </li>



<li><strong>Optimization of clinical processes:</strong> Medical data allows for better management of processes in medical facilities (hospitals, clinics, etc.). This results in shorter waiting times for patients, optimized availability of places in hospitals, and improved resource management. </li>



<li><strong>Research and development of therapies:</strong> Big Data accelerates clinical research and analysis of therapy results. This allows scientists to discover new medications, treatments, and medical novelties. </li>



<li><strong>Epidemics and public health monitoring:</strong> In the event of epidemics and threats to public health, Big Data allows for rapid monitoring and response. This can help control the spread of disease and save lives. </li>



<li><strong>Development of telemedicine:</strong> Big Data supports the development of telemedicine and makes medical consultations available remotely. This is especially important in situations when access to traditional healthcare is limited. </li>



<li><strong>Prevention of fraud and abuse:</strong> With Big Data, you can detect irregularities in billing and payments in health insurance systems. That&#8217;s a big help in preventing fraud. </li>
</ol>



<h2 class="wp-block-heading" id="Future-Prospects-of-Big-Data-in-Healthcare">Future Prospects of Big Data in Healthcare&nbsp;</h2>



<p>The future of Big Data in health organizations is highly promising. We had a first-hand look at the global growth of the Big Data in healthcare market during the COVID-19 pandemic, when it reached a value of $32.9 billion in 2021. From 2022 to 2032, this market is forecast to expand at a compound annual growth rate (CAGR) of 19.2%, reaching an impressive<strong> value of $94.7 billion </strong>by the end of 2032.&nbsp;</p>



<p>Currently, the healthcare data analysis sector accounts for approximately 14.2% of the entire healthcare sector, which is testament to the significant role it plays. Such favorable predictions will also affect the market for Big Data technologies and services; hence, as this technology is adopted in the healthcare sector, it will generate new sources of income. It is also worth mentioning that a significant part of the revenue related to data management in healthcare results from monitoring and managing transactional information between payers and healthcare providers.&nbsp;</p>



<h2 class="wp-block-heading" id="summary">Big Data management – summary&nbsp;</h2>



<p>The future of analyzing large volumes of data in healthcare is promising. Data processing brings a great deal of challenges and benefits for both medical providers and patients. Companies in the sector can count on reducing costs, improving the quality of services, optimizing clinical processes, developing innovative therapies, and monitoring people&#8217;s health. However, to be able to fully use the potential of Big Data, it is worth equipping yourself with the knowledge of experts. &nbsp;</p>



<p>At Inetum, we understand the complexity of customer needs in the healthcare industry, which is why we approach each project individually. If you are interested in using the potential of Big Data in your business or need advice in this area, please contact our specialist: &nbsp;</p>


<div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2024/06/BigCTA_MarekCzachorowski.jpg" alt="BigCTA MarekCzachorowski" title="Big Data in healthcare: management, analysis and future prospects for healthcare organizations 33"></div><div class="tile-content"><p class="entry-title client-name promotion-box__headline2">Elevate Your Data Strategy</p>
<p class="promotion-box__description2">Our customized Data solutions align with your business objectives. Consult with <strong>Marek Czachorowski</strong>, Head of Data and AI Solutions, for expert guidance.</p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Schedule a meeting</a></div></div></div></div>



<p></p>
]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/technologies/big-data-in-healthcare/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>What is Master Data Management? A Comprehensive Guide to MDM </title>
		<link>https://nearshore-it.eu/articles/master-data-management-what-is-mdm/</link>
					<comments>https://nearshore-it.eu/articles/master-data-management-what-is-mdm/#respond</comments>
		
		<dc:creator><![CDATA[Łukasz Pająk]]></dc:creator>
		<pubDate>Thu, 09 Nov 2023 05:11:00 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Organization]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=26093</guid>

					<description><![CDATA[How can you manage your data efficiently and why should you implement a Master Data Management program? In the article, I will guide you through the MDM process! ]]></description>
										<content:encoded><![CDATA[
<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#Introduction-to-Master-Data-Management-(MDM)-">1.  Introduction to Master Data Management (MDM) </a></li>
                    <li><a href="#Why-manage-Master-Data?-">2.  Understanding Master Data. Why manage Master Data? </a></li>
                    <li><a href="#hat-is-Master-Data-Management">3.  What is Master Data Management (MDM)? </a></li>
                    <li><a href="#Implementing-Master-Data-Management-solutions-">4.  Implementing Master Data Management solutions </a></li>
                    <li><a href="#Consistent-Master-Data-vs-information-flow-–-examples--">5.  Consistent Master Data vs information flow – examples  </a></li>
                    <li><a href="#Best-practices--">6.  Best practices  </a></li>
                    <li><a href="#How-do-we-optimize-the-data-management-process-in-the-organization?--">7.  How do we optimize the data management process in the organization?  </a></li>
                    <li><a href="#How-do-we-determine-the-Master-Data?--">8.  How do we determine the Master Data?  </a></li>
                    <li><a href="#Why-implement-the-MDM-strategy?-">9.  Why implement the MDM strategy? </a></li>
                    <li><a href="#Overview-of-enterprise-Master-Data-Management-solutions--">10.  Overview of enterprise Master Data Management solutions  </a></li>
                    <li><a href="#Which-MDM-solution-is-best-for-my-company?-">11.  Which MDM solution is best for my company? </a></li>
                    <li><a href="#MDM-system-implementation--">12.  Benefits of MDM system implementation  </a></li>
                    <li><a href="#Key-Considerations-for-Successful-Master-Data-Management-">13.  Key Considerations for Successful Master Data Management </a></li>
                    <li><a href="#FAQ-">14.  FAQ</a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="Introduction-to-Master-Data-Management-(MDM)-">Introduction to Master Data Management (MDM)&nbsp;</h2>



<p>In our blog posts, we have already discussed the topics of data quality, data integration, <a href="https://nearshore-it.eu/articles/business-intelligence-outsourcing/" target="_blank" rel="noreferrer noopener">data management</a>, and the visualization of enterprise data. The appropriate approach to data, understanding it, and using it knowledgeably allows many companies to spread their wings, regardless of the industry or business profile. This time I will present enterprise Master Data Management (MDM), i.e. the efficient management of basic data, which is a mandatory point for every <a href="https://nearshore-it.eu/articles/data-driven-managment/" target="_blank" rel="noreferrer noopener">data-driven company</a>.&nbsp;&nbsp;</p>



<p><strong>So how can MDM help? &nbsp;</strong></p>



<h2 class="wp-block-heading" id="Why-manage-Master-Data?-">Understanding Master Data. Why manage Master Data?&nbsp;</h2>



<p>The 21st century has brought lots of data from multiple sources and information that flows through companies every day. And there are no indications that there will be less data in the future. Quite the opposite – we should be prepared for the fact that the amount of data will keep growing, and exponentially. However, hidden somewhere among all this data is a real treasure that can change the way any organization works – namely master data.<em> </em>&nbsp;</p>



<h2 class="wp-block-heading" id="hat-is-Master-Data-Management">What is Master Data Management (MDM)?&nbsp;</h2>



<p>Or: what do we really want to manage and what should we manage? Do we know which data is most important to our organization?&nbsp;</p>



<p>Master data is the key information about individuals, objects, or concepts that is crucial to a company’s operations. In the context of Master Data Management (MDM), master data is the information considered fundamental and vital to the proper functioning of the business. &nbsp;</p>



<p>Moreover, companies process large amounts of other transactional data, as well as generating analytical data that can be based on Master Data. This covers various aspects of the business and data types, such as:&nbsp;</p>



<ul class="wp-block-list">
<li><strong>clients </strong></li>



<li><strong>products </strong></li>



<li><strong>employees </strong></li>



<li><strong>providers </strong></li>



<li><strong>locations and other relevant issues. </strong></li>
</ul>



<h2 class="wp-block-heading" id="Implementing-Master-Data-Management-solutions-">Implementing Master Data Management solutions&nbsp;</h2>



<p>After this brief introduction, it may already be clear why you should manage data. The purpose of Master Data Management (once it is defined and known to us) is to<strong> ensure the consistency, quality, integrity, and availability of this data in all systems and processes of the organization. </strong>As a result, this data becomes a valuable resource that can be used to make more accurate business decisions, optimize operations, and increase the company&#8217;s competitiveness. MDM also helps prevent errors resulting from inconsistent or outdated information, which can negatively impact effectiveness. &nbsp;</p>



<h2 class="wp-block-heading" id="Consistent-Master-Data-vs-information-flow-–-examples--">Consistent Master Data vs information flow – examples &nbsp;</h2>



<p>Imagine a sales representative who presents an offer to a customer already being handled by another coworker in the department, due to duplicate records in the CRM. Data quality issues including a mess in key data not only reduce the effectiveness of sales activities but also undermine the company&#8217;s reputation. There are numerous examples of this type. Outdated customer data in the database will not allow marketers running mailing campaigns to reach the right person. The lack of key product data in the <strong>ERP system means a risk of real sales losses</strong>. &nbsp;</p>
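The duplicate-record problem described above is often attacked with simple record matching. The sketch below flags near-identical CRM entries by comparing normalized name-and-email keys with Python's standard difflib; the similarity threshold and sample records are assumptions for illustration, and real MDM tools use far more sophisticated matching rules.

```python
from difflib import SequenceMatcher

def normalize(record: dict) -> str:
    """Build a comparison key from name + email, lowercased and trimmed."""
    return f"{record['name'].strip().lower()}|{record['email'].strip().lower()}"

def find_duplicates(records: list[dict],
                    threshold: float = 0.9) -> list[tuple[str, str]]:
    """Return pairs of record IDs whose normalized keys are near-identical."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            ratio = SequenceMatcher(None, normalize(a), normalize(b)).ratio()
            if ratio >= threshold:
                pairs.append((a["id"], b["id"]))
    return pairs

# Hypothetical CRM extract: C-1 and C-2 are the same customer entered twice.
crm = [
    {"id": "C-1", "name": "Anna Kowalska",  "email": "anna.kowalska@example.com"},
    {"id": "C-2", "name": "Anna Kowalska ", "email": "Anna.Kowalska@example.com"},
    {"id": "C-3", "name": "Jan Nowak",      "email": "jan.nowak@example.com"},
]
print(find_duplicates(crm))  # → [('C-1', 'C-2')]
```

Even this naive check would have prevented the duplicated sales offer in the scenario above, because the second record would have been flagged before it reached a representative.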



<h2 class="wp-block-heading" id="Best-practices--">Best practices &nbsp;</h2>



<p>That&#8217;s why considering which data in your organization should be treated as master data is one of the best practices. Managing the most valuable data properly is, among other things, the foundation for building data quality metrics that I discussed in the article on <a href="https://nearshore-it.eu/pl/artykuly/data-governance/" target="_blank" rel="noreferrer noopener">Data Governance. </a>  </p>



<h2 class="wp-block-heading" id="How-do-we-optimize-the-data-management-process-in-the-organization?--">How do we optimize the data management process in the organization? &nbsp;</h2>



<p>Ensuring data consistency and optimizing the data management process in your organization is a multifaceted task that requires taking various factors into account. &nbsp;</p>



<ol class="wp-block-list">
<li><strong>Check what role data plays in achieving your company&#8217;s goal</strong> – first, you need to know and understand the organization&#8217;s goals and what role data plays in accomplishing them.  </li>



<li><strong>Determine the master data</strong> – then decide which data can be considered key for the company.  </li>



<li><strong>Develop good practices</strong> – having completed this process, it is necessary to develop data management standards and procedures that will be implemented throughout the organization. By this, we mean an approach to both presenting and understanding specific concepts.  </li>



<li><strong>Take care of the quality of your data</strong> – in the next steps, it is important to ensure high data quality and eliminate errors and potential duplicates in the creation of master data. The aforementioned data quality metrics will certainly be necessary. </li>
</ol>
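Two of the most common data quality metrics relevant to step 4, completeness and duplicate rate, can be computed in a few lines. The field names and sample records below are illustrative assumptions.

```python
def completeness(records: list[dict], field: str) -> float:
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def duplicate_rate(records: list[dict], key: str) -> float:
    """Share of records that repeat an already-seen key value."""
    seen, dupes = set(), 0
    for r in records:
        value = r.get(key)
        if value in seen:
            dupes += 1
        seen.add(value)
    return dupes / len(records)

# Hypothetical customer master data with one missing email and one
# duplicated tax ID.
customers = [
    {"tax_id": "123", "email": "a@example.com"},
    {"tax_id": "123", "email": ""},
    {"tax_id": "456", "email": "b@example.com"},
    {"tax_id": "789", "email": "c@example.com"},
]
print(completeness(customers, "email"))     # → 0.75
print(duplicate_rate(customers, "tax_id"))  # → 0.25
```

Tracked over time, such metrics turn "take care of the quality of your data" from a slogan into a measurable target.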



<h2 class="wp-block-heading" id="How-do-we-determine-the-Master-Data?--">How do we determine the Master Data? &nbsp;</h2>



<p>At this point, MDM systems come in handy. There are many such providers – I will discuss them in a moment. They allow you to organize everything that is really important in data management in one place, i.e., they are an information center facilitating standardization, synchronization, integration, data access management, and regulatory controls. The latter point is especially important when the law changes and the management of data in the company should be adjusted accordingly (this was the case, for example, at the time the <strong>General Data Protection Regulation (GDPR) </strong>came into force).&nbsp;</p>
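One core function of such MDM systems is consolidating the same entity's records from several systems into a single "golden record". The sketch below applies a simple survivorship rule (the newest non-empty value wins); the source systems, field names, and the rule itself are illustrative assumptions, as real MDM tools offer configurable survivorship policies.

```python
from datetime import date

def golden_record(records: list[dict]) -> dict:
    """Merge records field by field, preferring the newest non-empty value."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if field in ("source", "updated"):
                continue  # bookkeeping fields, not master data
            if value not in (None, ""):
                merged[field] = value  # later (newer) records overwrite
    return merged

# The same customer as seen by two hypothetical systems.
crm = {"source": "CRM", "updated": date(2023, 5, 1),
       "name": "Anna Kowalska", "phone": "+48 600 000 000", "email": ""}
erp = {"source": "ERP", "updated": date(2023, 9, 1),
       "name": "Anna Kowalska", "phone": "", "email": "anna@example.com"}

print(golden_record([crm, erp]))
```

The merged record keeps the phone number that only the CRM knew and the email that only the ERP knew, which is exactly the "information center" role described above.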



<h2 class="wp-block-heading" id="Why-implement-the-MDM-strategy?-">Why implement the MDM strategy?&nbsp;</h2>



<p>With a proper and well-organized approach to master data management, it is much easier to present data to employees. We also minimize the risk of situations in which different corporate systems have to determine master data on their own. This also applies to reporting analyses: when we know where to look for information, e.g., about customers and invoices, we can avoid conflicting results in reports. &nbsp;</p>



<h2 class="wp-block-heading" id="Overview-of-enterprise-Master-Data-Management-solutions--">Overview of enterprise Master Data Management solutions &nbsp;</h2>



<p>There are numerous <strong>Master Data Management (MDM) systems</strong> on the market from different providers. Most of them cover the MDM essentials, but each provider delivers a different set of functions, and the tools come with varying capabilities.&nbsp;</p>



<p>Below are some examples of the most popular MDM solutions:&nbsp;</p>



<ol class="wp-block-list">
<li><strong>Informatica MDM:</strong> Informatica specializes in data management solutions, and Informatica MDM is its Master Data Management platform. It offers a wide range of features, including deduplication, integration, and Data Quality management. </li>



<li><strong>SAP Master Data Governance:</strong> SAP offers MDM solutions that are integrated with the provider&#8217;s ERP systems. SAP Master Data Governance allows companies to manage customer data, products, and other critical data. </li>



<li><strong>IBM InfoSphere MDM:</strong> The InfoSphere solution is a comprehensive data management tool that allows organizations to manage customer, product, and supplier data, among others.  </li>



<li><strong>SAS MDM:</strong> SAS offers tools that allow organizations to manage master data and use it for analysis and reporting. </li>



<li><strong>TIBCO MDM:</strong> TIBCO Software offers a solution that allows you to manage customer data, products, and other key information in your organization. </li>
</ol>



<h2 class="wp-block-heading" id="Which-MDM-solution-is-best-for-my-company?-">Which MDM solution is best for my company?&nbsp;</h2>



<p>These are just a few examples of MDM systems available on the market. The choice of the right system depends mainly on the needs and requirements of the company and on the budget. Each of the aforementioned providers offers a free demo of their tools, so it is worth taking the time to thoroughly familiarize yourself with the capabilities of individual solutions.&nbsp;</p>



<h2 class="wp-block-heading" id="MDM-system-implementation--">Benefits of MDM system implementation&nbsp;&nbsp;</h2>



<ul class="wp-block-list">
<li><strong>Centralized source of master data </strong>– first of all, after implementing the MDM software and correct initial configuration, the company gains a central source of master data, which significantly facilitates analysis and further company growth based on key data. Any integration of new systems, the expansion of current ones, and sometimes even rebuilding them becomes much easier when we know which data we should use and how we should use it. This way, the introduction of a new CRM system does not require analysis related to determining the source of customer information. The data is at your fingertips. </li>



<li><strong>Better results</strong> – without duplicating errors, we get better quality products, which are also created faster, because the risk of incorrect results, and thus subsequent iterations related to improvement, is reduced. This, in turn, directly leads to better competitiveness in the market and also helps to increase trust among business partners, who are served more efficiently.  </li>



<li><strong>Improved analytics</strong> – naturally, all analytical activities gain better flow and allow you to make better business decisions.  </li>



<li><strong>Facilitated offer customization</strong> – the consistent master data that we have under control directly translates into better customer service and allows us to better customize the company&#8217;s offer.  </li>
</ul>



<p><strong>Also read:</strong> <a href="https://nearshore-it.eu/articles/technologies/data-mining-methods/" target="_blank" rel="noreferrer noopener">Data mining techniques</a>&nbsp;</p>



<h2 class="wp-block-heading" id="Key-Considerations-for-Successful-Master-Data-Management-">Key Considerations for Successful Master Data Management&nbsp;</h2>



<p>I hope that this article has drawn attention to the Master Data Management (MDM) field and the fact that the management of master data records is important. So let&#8217;s sum it up in a few sentences. It is worth remembering that a systemic approach to managing master data (e.g. customers or products) guarantees higher-quality services and better results and translates into increased satisfaction both within the company and outside – for our customers. &nbsp;</p>



<p>By ensuring the consistency and uniformity of key data within the main data sources, we accelerate all business processes, prepare for any legal changes, and ensure that strategic decisions are made on the basis of relevant information.&nbsp;</p>



<figure class="wp-block-table"><table><tbody><tr><td><br><div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2023/02/FotoInetum_abstract18.jpg" alt="FotoInetum abstract18" title="What is Master Data Management? A Comprehensive Guide to MDM  34"></div><div class="tile-content"><p class="entry-title client-name">DATA ANALYTICS SERVICES</p>

<h3>Use data to your advantage</h3>
Discover our Data Management End-to-End offer!
<a class="btn btn-primary" href="https://nearshore-it.eu/modern-data-solutions/" target="_blank" rel="noopener">Find out more!</a></div></div></div></div></td></tr></tbody></table></figure>



<h2 class="wp-block-heading" id="FAQ-"><strong>FAQ – common Master Data Management strategy questions </strong>&nbsp;</h2>



<p>Finally, here are a few frequently asked questions and answers that may interest you if you are intrigued by Master Data Management (MDM). &nbsp;</p>


<div id="rank-math-faq" class="rank-math-block">
<div class="rank-math-list ">
<div id="faq-question-1699263481374" class="rank-math-list-item">
<h3 class="rank-math-question ">What does a Master Data Specialist do? </h3>
<div class="rank-math-answer ">

<p>A Master Data Specialist manages an organization&#8217;s core (master) data. Their main tasks include data governance, deduplication and data cleansing, support for system integration, data quality monitoring, participation in the development of business processes, and cooperation with other departments to establish a single source of truth. </p>

</div>
</div>
<div id="faq-question-1699263482469" class="rank-math-list-item">
<h3 class="rank-math-question ">What is MDM? </h3>
<div class="rank-math-answer ">

<p>MDM stands for Master Data Management. It is an approach and a set of practices and technologies that organizations use to effectively manage master data. The main goal of MDM is to ensure the consistency, accuracy, and availability of this data throughout the organization, eliminating errors, inconsistencies, and duplicates.  </p>

</div>
</div>
<div id="faq-question-1699263489886" class="rank-math-list-item">
<h3 class="rank-math-question ">Why is it worth organizing master data?  </h3>
<div class="rank-math-answer ">

<p>By ensuring that the basic data in our company is well organized, we safeguard the consistency and accuracy of the information processed. This directly translates into fewer errors in the operation of systems and fewer inconsistencies in reports and analyses, significantly reducing the risk of unnecessary costs. This way we save not only resources but also time; both employees and business partners will appreciate it. </p>

</div>
</div>
<div id="faq-question-1699263502556" class="rank-math-list-item">
<h3 class="rank-math-question ">Types of Master Data </h3>
<div class="rank-math-answer ">

<p>From the perspective of Master Data Management, the following types of data most often come to the forefront: </p><ul><li><strong>customer data:</strong> name, surname, address, telephone number, e-mail address</li><li><strong>product data:</strong> name, description, price, product code, supplier, availability</li><li><strong>employee data:</strong> personal data, qualifications, remuneration</li><li><strong>supplier details:</strong> contact details, terms of contracts</li><li><strong>location data:</strong> addresses, geolocation</li></ul><p>Different departments can use different data, so it is always worth taking care of the right source within MDM. </p>
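<p>As a quick illustration, a customer master record like the one in the first bullet could be modeled as a simple typed structure. This is only a sketch; the class and field names below are illustrative, not from any real MDM system.</p>

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CustomerMasterRecord:
    """A 'golden record': the one version of a customer all systems share."""
    customer_id: str  # stable key shared across integrated systems
    name: str
    surname: str
    address: str
    phone: str
    email: str

record = CustomerMasterRecord(
    "C-001", "Anna", "Nowak", "Warsaw, PL",
    "+48 600 000 000", "anna.nowak@example.com",
)
```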

</div>
</div>
</div>
</div>]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/articles/master-data-management-what-is-mdm/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Introduction to Qlik Sense ODAG: On-Demand App Generation</title>
		<link>https://nearshore-it.eu/technologies/big-data-management-with-odag/</link>
					<comments>https://nearshore-it.eu/technologies/big-data-management-with-odag/#respond</comments>
		
		<dc:creator><![CDATA[Michal Ogonowski]]></dc:creator>
		<pubDate>Thu, 12 Oct 2023 09:46:05 +0000</pubDate>
				<category><![CDATA[Technologies]]></category>
		<category><![CDATA[Articles]]></category>
		<category><![CDATA[Business Intelligence]]></category>
		<guid isPermaLink="false">https://nearshore-it.eu/?p=25576</guid>

					<description><![CDATA[Dealing with the explosive growth of Big Data? Learn how Qlik Sense's On-Demand App Generation can help! Introduced in 2017, it was a game-changer. With the amount of data generated each year still growing rapidly, we're set to hit 120 zettabytes in 2023. Discover how Qlik Sense handles this massive data challenge in our article. Read more!]]></description>
										<content:encoded><![CDATA[
<p>In the world of Big Data, where data sets are growing at an enormous rate, you need effective Big Data Management tools to deal with the challenges of processing huge amounts of information. </p>



<p>To address these challenges, <strong>Qlik Sense introduced the On-Demand App Generation (ODAG)</strong> functionality in the September 2017 release. Since then, the amount of data generated every year has almost quadrupled; in 2023 we expect to reach 120 zettabytes of data! How can we analyze such huge data sets effectively with the tools available? In this article, I examine the capabilities of Qlik Sense, one of the most powerful Big Data technologies.</p>



<div class="table-of-contents">
    <p class="title"></p>
    <ol>
                    <li><a href="#The-Importance-of-Big-Data-Management">1.  The Importance of Big Data Management</a></li>
                    <li><a href="#Big-Data-Management-tools:-Qlik-Sense-ODAG">2.  Big Data Management tools: Qlik Sense ODAG</a></li>
                    <li><a href="#Big-Data-Management-best-practices-–-the-checklist-for-BI-developers">3.  Big Data Management best practices – the checklist for BI developers</a></li>
                    <li><a href="#What-is-ODAG-in-data-analytics?">4.  What is ODAG in data analytics?</a></li>
                    <li><a href="#ODAG---the-key-components-of-a-Big-Data-management-tool">5.  ODAG &#8211; the key components of a Big Data management tool </a></li>
                    <li><a href="#How-to-create-On-Demand-Applications-(ODAG)">6.  How to create On-Demand Applications (ODAG)</a></li>
                    <li><a href="#Use-Case">7.  Use Case</a></li>
                    <li><a href="#Managing-Big-Data-with-ODAG">8.  Managing Big Data with ODAG</a></li>
                    <li><a href="#What-is-the-Qlik-Associative-Big-Data-Index?">9.  What is the Qlik Associative Big Data Index?</a></li>
                    <li><a href="#How-does-the-Qlik-Associative-Big-Data-Index-work?">10.  How does the Qlik Associative Big Data Index work?</a></li>
                    <li><a href="#How-to-address-Big-Data-Management-challenges">11.  Summary</a></li>
            </ol>
</div>


<h2 class="wp-block-heading" id="The-Importance-of-Big-Data-Management">The Importance of Big Data Management</h2>



<p>The world today is dominated by Big Data: huge amounts of data generated in sectors such as e-commerce, healthcare, science, social media, and more. The challenge most companies and data scientists face is to turn raw, unstructured data into valuable information and analyze it as effectively as possible. But what data are we talking about? To help picture it, I have listed some interesting facts below.</p>



<ul class="wp-block-list">
<li>Approximately <strong>328.77 million terabytes</strong> of data are created every day</li>



<li>Approximately <strong>120 zettabytes</strong> of data will be generated in 2023</li>



<li>In 2025, <strong>181 zettabytes</strong> of data will be generated</li>



<li>Videos make up <strong>more than half of the data processed</strong> in the world</li>
</ul>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="762" height="400" src="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_wykres.gif" alt="Big Data Management" class="wp-image-25591" title="Introduction to Qlik Sense ODAG: On-Demand App Generation 35"><figcaption class="wp-element-caption"><a href="https://cdn.buttercms.com/output=f:webp/ods4p5fQVmXkFeHFP3Zx" target="_blank" rel="noopener">https://cdn.buttercms.com/output=f:webp/ods4p5fQVmXkFeHFP3Zx</a></figcaption></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<h2 class="wp-block-heading" id="Big-Data-Management-tools:-Qlik-Sense-ODAG">Big Data Management tools: Qlik Sense ODAG</h2>



<p>In every industry, we deal with many types of data that we can call Big Data, which can be analyzed with tools such as <a href="https://nearshore-it.eu/articles/technologies/qlik-sense-extensions/" target="_blank" data-type="URL" data-id="https://nearshore-it.eu/articles/technologies/qlik-sense-extensions/" rel="noreferrer noopener">Qlik Sense</a>. There are plenty of examples of Big Data sets; for example, the aforementioned social media data (posts, photos, data on user preferences, behaviors or relationships), or data generated in e-commerce (purchase data, reviews, searches or customer behavior on the website).</p>



<h2 class="wp-block-heading" id="Big-Data-Management-best-practices-–-the-checklist-for-BI-developers">Big Data Management best practices – the checklist for BI developers</h2>



<p>Here are the steps you should consider following to maximize the potential of Qlik Sense when it comes to analyzing such data. The checklist below will be helpful for any BI developer, but I think it will also help business representatives plan their work (e.g. to better familiarize themselves with the process):</p>



<ul class="wp-block-list">
<li>Make sure you have collected all the data you need from <strong>different data sources</strong>.</li>



<li>Take advantage of the connectors offered by tools for Big Data platforms such as Hadoop or Spark.</li>



<li>Design your Qlik Sense applications with ODAG in mind <strong>(On-Demand App Generation</strong>, which I will discuss in detail later in the article). This includes identifying key dimensions that users wish to analyze and creating appropriate mechanisms for loading data on demand.</li>



<li>Think about what tables and fields are really needed. Avoid complex data models that can slow down performance (try to create a model called a &#8220;star&#8221; or &#8220;snowflake&#8221;). Make sure that the fact tables are as simple as possible, and transfer detailed information to the dimension tables.</li>



<li><strong>Monitor performance</strong> and adjust the architecture for increasing data volume. This may include additional server resources, optimized data models, or application design best practices.</li>



<li><strong>Ensure proper QS configuration</strong> in terms of safety. Define the necessary permission levels to ensure the protection of valuable and sensitive Big Data collections.</li>



<li>Take advantage of additional extensions available in the<strong> Qlik ecosystem</strong>. You can use those built by the community, or more professional extensions, such as the Vizlib library. They offer additional features or integrations to help you get the most out of the platform.</li>



<li>Technology and business needs are constantly changing. Update your Qlik Sense apps regularly, taking new features, data, and user requirements into account.</li>



<li>Don&#8217;t forget training. Users should know how to use the tool, how to interpret visualizations, and how to use ODAG features.</li>
</ul>
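The &#8220;star&#8221; model mentioned in the checklist can be pictured in a few lines of plain Python. This is a minimal sketch with made-up table and field names: the fact table keeps only foreign keys and a measure, while descriptive attributes live in dimension tables joined in at query time.

```python
# Dimension tables hold the descriptive attributes.
dim_product = {
    1: {"product": "Laptop", "category": "Electronics"},
    2: {"product": "Desk", "category": "Furniture"},
}
dim_store = {10: {"city": "Warsaw"}, 20: {"city": "Lodz"}}

# The fact table stays slim: one row per sale, keys plus a measure.
fact_sales = [
    {"product_id": 1, "store_id": 10, "amount": 3200},
    {"product_id": 2, "store_id": 20, "amount": 450},
    {"product_id": 1, "store_id": 20, "amount": 2900},
]

def enrich(fact_row):
    """Join one fact row with its product and store dimensions."""
    return {**fact_row,
            **dim_product[fact_row["product_id"]],
            **dim_store[fact_row["store_id"]]}

rows = [enrich(r) for r in fact_sales]
electronics_total = sum(r["amount"] for r in rows
                        if r["category"] == "Electronics")
```

Keeping detail out of the fact table is what lets the engine aggregate quickly even when the fact table grows to millions of rows.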



<p class="has-text-align-center"><strong>Also read:</strong> <a href="https://nearshore-it.eu/articles/business-intelligence-outsourcing/"><strong>&nbsp;Business Intelligence outsourcing services</strong></a></p>



<h2 class="wp-block-heading" id="What-is-ODAG-in-data-analytics?">What is ODAG in data analytics?</h2>



<p>What do we do when we are working with huge data sets that are too large to load entirely into memory &#8211; for example, when we want to analyze only one day of sales at a hypermarket? The ODAG functionality mentioned in the introduction comes in handy.</p>



<p><strong>ODAG is one of the Big Data management solutions</strong> available on the market. It is a technique that lets you dynamically create a Qlik application based on user choices. Thanks to ODAG, only those fragments of data that are needed at a given moment are loaded and analyzed. Instead of loading the entire database, the user chooses criteria or filters that determine the slice of data of interest; based on these choices, Qlik Sense generates a new application containing only the data that meets the selected criteria.</p>
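The mechanics can be sketched in a few lines of plain Python. This is a toy illustration of the idea, not Qlik&#8217;s actual API: the user&#8217;s selections act as a filter, and only the matching slice of the source is materialized into the generated &#8220;app&#8221;.

```python
# In reality the full source table lives in a database, not in memory.
SOURCE_DATA = [
    {"region": "EMEA", "year": 2023, "sales": 120},
    {"region": "EMEA", "year": 2022, "sales": 95},
    {"region": "APAC", "year": 2023, "sales": 80},
]

def generate_on_demand_app(selections):
    """Return only the rows matching every selected field/value set."""
    return [row for row in SOURCE_DATA
            if all(row[field] in allowed
                   for field, allowed in selections.items())]

# The user picks region = EMEA and year = 2023 in the parent app;
# the generated app contains only that slice.
app_data = generate_on_demand_app({"region": {"EMEA"}, "year": {2023}})
```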



<h2 class="wp-block-heading" id="ODAG---the-key-components-of-a-Big-Data-management-tool">ODAG &#8211; the key components of a Big Data management tool</h2>



<p>ODAG is based on three main components:</p>



<ul class="wp-block-list">
<li><strong>Parent App</strong>: the main Qlik application that contains all the available data or, most often, a representative subset of data. Here the user makes initial choices, specifying what data will be needed in the generated application.</li>



<li><strong>On-Demand Filters</strong>: a collection of settings that determines what data is to be loaded into a new application, based on user choices.</li>



<li><strong>On-Demand App</strong>: an application generated dynamically based on user choices in the parent application and filters. It contains only the data that has been specified by the user.</li>
</ul>


<div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2024/06/BigCTA_MarekCzachorowski.jpg" alt="BigCTA MarekCzachorowski" title="Introduction to Qlik Sense ODAG: On-Demand App Generation 36"></div><div class="tile-content"><p class="entry-title client-name promotion-box__headline2">Elevate Your Data Strategy</p>
<p class="promotion-box__description2">Our customized Data solutions align with your business objectives. Consult with <strong>Marek Czachorowski</strong>, Head of Data and AI Solutions, for expert guidance.</p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Schedule a meeting</a></div></div></div></div>



<h2 class="wp-block-heading" id="How-to-create-On-Demand-Applications-(ODAG)">How to create On-Demand Applications (ODAG)</h2>



<p>Creating an ODAG application is easy &#8211; it requires a developer to follow only a few steps.</p>



<ol class="wp-block-list">
<li><strong>Creation of a Parent App / Selection App</strong>. This is an application that usually contains filters to narrow down the data and aggregated values from the main data set.</li>
</ol>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-large"><img loading="lazy" decoding="async" width="1296" height="621" src="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_2-1296x621.png" alt="Big Data Management" class="wp-image-25585" title="Introduction to Qlik Sense ODAG: On-Demand App Generation 37" srcset="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_2-1296x621.png 1296w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_2-300x144.png 300w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_2-768x368.png 768w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_2-1536x736.png 1536w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_2-495x237.png 495w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_2.png 1920w" sizes="auto, (max-width: 1296px) 100vw, 1296px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<p><strong>2. Creating a Template App</strong>, which will be populated with data once the selections from the parent application are applied. This application should include the visualizations that users expect to have.</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-large"><img loading="lazy" decoding="async" width="1296" height="626" src="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_3-1296x626.png" alt="Big Data Management" class="wp-image-25587" title="Introduction to Qlik Sense ODAG: On-Demand App Generation 38" srcset="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_3-1296x626.png 1296w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_3-300x145.png 300w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_3-768x371.png 768w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_3-1536x742.png 1536w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_3-495x239.png 495w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_3.png 1920w" sizes="auto, (max-width: 1296px) 100vw, 1296px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<p>3. <strong>In the parent application</strong>, an On-Demand application link should be created. Its role is to generate an application for the user based on the Template App and data narrowed down by selections from the parent application.</p>



<p>4. The last step is to <strong>generate the application</strong> with the &#8220;Generate a new application&#8221; button.</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-large"><img loading="lazy" decoding="async" width="1296" height="597" src="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_4-1296x597.png" alt="Big Data Management" class="wp-image-25589" title="Introduction to Qlik Sense ODAG: On-Demand App Generation 39" srcset="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_4-1296x597.png 1296w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_4-300x138.png 300w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_4-768x354.png 768w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_4-1536x707.png 1536w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_4-495x228.png 495w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphics_4.png 1920w" sizes="auto, (max-width: 1296px) 100vw, 1296px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>
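Behind step 3, the template app&#8217;s load script contains data-binding placeholders that are filled in with the parent app&#8217;s selections when the new application is generated; in Qlik script these take forms such as <code>$(od_FIELDNAME)</code>. Below is a rough Python illustration of the substitution idea only; the script text and function names are made up, not a real generated Qlik script.

```python
# A template script with a binding placeholder for the Region field.
TEMPLATE_SCRIPT = "SELECT * FROM Sales WHERE Region IN ({od_Region});"

def bind_selections(template, selections):
    """Replace each {od_Field} placeholder with the selected values,
    quoted and comma-separated, mimicking ODAG-style data binding."""
    bindings = {
        f"od_{field}": ", ".join(f"'{v}'" for v in sorted(values))
        for field, values in selections.items()
    }
    return template.format(**bindings)

# The user selected two regions in the parent app.
script = bind_selections(TEMPLATE_SCRIPT, {"Region": {"EMEA", "APAC"}})
```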



<h2 class="wp-block-heading" id="Use-Case">Use Case</h2>



<p>Thanks to ODAG applications, users can focus on specific pieces of huge data sets instead of searching the entire database. Below, I present four examples of the use of ODAG in Qlik Sense in different industries.</p>



<h3 class="wp-block-heading">Retail Sector and Sales Analysis</h3>



<p>A retail company has long-term sales data from all its stores around the world. Instead of analyzing the entire database, managers want to focus on data from a specific region and period of time. Using ODAG, the manager selects the region and date range they are interested in. Then, based on these criteria, the Qlik Sense application is generated, presenting only the relevant data.</p>



<h3 class="wp-block-heading">Medical analysis and clinical trials</h3>



<p>A hospital needs to analyze patient data to identify patterns in certain diseases. Physicians can use ODAG to <strong>generate reports on specific diseases</strong>, based on selected criteria (e.g., patient age, gender, location). The application will be generated with only the data of patients meeting these criteria, which will facilitate the analysis and identification of patterns.</p>



<h3 class="wp-block-heading">Market analysis and financial data</h3>



<p>An investment company wants to analyze market data from the last 10 years, but instead of searching the entire database, an analyst wants to focus only on specific segments. Using ODAG, the analyst can select specific market segments and time ranges. As in the examples above, this generates an application containing only the data relevant to an accurate analysis.</p>



<h3 class="wp-block-heading">Transport Sectors and Logistics Analysis</h3>



<p>A logistics company wants to optimize its delivery routes based on data from past years. They have hundreds of thousands of records for different routes, drivers, and road conditions. Thanks to ODAG, the planner can focus on selected routes, drivers or time ranges, and the generated Qlik Sense application will provide accurate data for analysis and optimization.<br></p>



<h2 class="wp-block-heading" id="Managing-Big-Data-with-ODAG">Managing Big Data with ODAG</h2>



<p>On-Demand App Generation is the next step in the evolution of data analytics tools. As you can see, this solution allows for a more efficient and flexible use of resources in the world of Big Data. Its most important benefits include:</p>



<p><strong>• Flexibility and scalability</strong></p>



<p>Instead of trying to load huge amounts of data into one application, ODAG allows users to focus on the subset that is most relevant to them at a given time. Thanks to this, Qlik Sense works quickly and efficiently even with huge data sets.</p>



<p><strong>• Optimizing resources</strong></p>



<p>Thanks to the on-demand approach, there is no need to constantly process and update all available data. Instead, data is processed only when the user needs it.</p>
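This on-demand principle is essentially lazy evaluation. A Python generator shows the same idea in miniature: no record is read or transformed until something actually consumes the stream.

```python
def load_records(source):
    """Lazily process records: no work happens until iteration starts."""
    for raw in source:
        yield raw * 2  # stand-in for an expensive transformation

stream = load_records(range(1_000_000))  # nothing processed yet
first_three = [next(stream) for _ in range(3)]  # only 3 records touched
```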



<p>• <strong>Interactivity</strong></p>



<p>Users can easily define what data they want to analyze, making Big Data analytics more interactive and targeted.</p>



<p><strong>• Better cost management</strong></p>



<p>Big Data processing can be costly, especially when using cloud solutions based on the consumption of resources. With the on-demand approach, it is easier to control costs because you only pay for processing the data that is actually used.</p>



<p><strong>• Improved performance</strong></p>



<p>Thanks to the option to focus on specific data instead of the entire set, loading and analysis time is shorter.</p>



<div class="promotion-box promotion-box--image-left "><div class="tiles latest-news-once"><div class="tile"><div class="tile-image"><img decoding="async" src="https://nearshore-it.eu/wp-content/uploads/2023/02/FotoInetum_abstract18.jpg" alt="FotoInetum abstract18" title="Introduction to Qlik Sense ODAG: On-Demand App Generation 40"></div><div class="tile-content"><p class="entry-title client-name">DATA ANALYTICS SERVICES</p>

<h3>Use data to your advantage</h3>
Discover our Data Management End-to-End offer!
<a class="btn btn-primary" href="https://nearshore-it.eu/modern-data-solutions/" target="_blank" rel="noopener">Find out more!</a></div></div></div></div>



<h2 class="wp-block-heading" id="What-is-the-Qlik-Associative-Big-Data-Index?">What is the Qlik Associative Big Data Index?</h2>



<p>You may be wondering which technologies make it possible to work with huge data sets in Qlik Sense and how to further speed up analysis with the use of ODAG. One such feature is the <strong>Qlik Associative Big Data Index (QABDI)</strong>. This is a technology introduced by Qlik to facilitate fast and interactive search and analysis of large data sets. QABDI is a solution aimed at Big Data environments, such as Hadoop or various data warehouse platforms. It allows users to analyze data without having to load it into memory, which is crucial when working with huge amounts of data.</p>



<h2 class="wp-block-heading" id="How-does-the-Qlik-Associative-Big-Data-Index-work?">How does the Qlik Associative Big Data Index work?</h2>



<p><strong>QABDI creates an associative index from huge data sets in source Big Data systems</strong>. This index is similar to an index in a traditional database, but it is adapted to Qlik&#8217;s associative model, which allows you to search and analyze data faster.</p>



<p>When the user makes a selection or creates a query in the Qlik application, the system searches the QABDI index, instead of employing traditional data loading and analyzing methods. Thanks to this, responses to queries are delivered faster, even for very large data sets.</p>



<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="756" height="201" src="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphic_1.png" alt="Big Data Management" class="wp-image-25593" title="Introduction to Qlik Sense ODAG: On-Demand App Generation 41" srcset="https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphic_1.png 756w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphic_1-300x80.png 300w, https://nearshore-it.eu/wp-content/uploads/2023/10/nearshore_2023.10.12_graphic_1-495x132.png 495w" sizes="auto, (max-width: 756px) 100vw, 756px" /></figure>
</div>


<div style="height:30px" aria-hidden="true" class="wp-block-spacer"></div>



<p>Remember that users can still use the Qlik interface in the traditional way (<strong>make selections</strong>, <strong>search data</strong> and <strong>create visualizations</strong>). However, thanks to QABDI, these operations are carried out directly on large data sets, without the need for prior processing or aggregation.</p>



<p>For more complex analyses where detailed data is needed, Qlik can automatically move from using QABDI to requesting detailed data directly from the source Big Data system. QABDI is also designed with scalability in mind: as the amount of data grows, you can easily adjust the infrastructure and resources to maintain consistently good performance.</p>
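The idea of answering selections from an index rather than scanning raw rows can be sketched with a simple inverted index in Python. This is a drastic simplification of what QABDI actually does, shown only to make the principle concrete.

```python
from collections import defaultdict

ROWS = [("EMEA", 2023), ("EMEA", 2022), ("APAC", 2023)]

# Build one inverted index per column: value -> set of row ids.
indexes = [defaultdict(set), defaultdict(set)]
for row_id, row in enumerate(ROWS):
    for col, value in enumerate(row):
        indexes[col][value].add(row_id)

# A user selection becomes a set intersection over the indexes;
# the row data itself is never scanned.
matching_rows = indexes[0]["EMEA"] & indexes[1][2023]
```

Because intersections operate on compact sets of row ids, the query cost depends on the size of the index entries, not on the total volume of the underlying data.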



<h2 class="wp-block-heading" id="How-to-address-Big-Data-Management-challenges">How to address Big Data Management challenges</h2>



<p>ODAG and QABDI are powerful tools in the Qlik Sense arsenal that give users dynamic access to huge data sets. This makes it possible to conduct effective analysis and draw conclusions without burdening the system with unnecessary data, allowing analysts and managers to make informed decisions based on accurate and up-to-date information. In a world where data is no longer measured in terabytes but in zettabytes, this approach will help you save time and money.</p>


<div class="promotion-box promotion-box--image-left promotion-box--full-width-without-image"><div class="tiles latest-news-once"><div class="tile"><div class="tile-content"><p class="promotion-box__description2"><strong>Consult your project directly with a specialist</strong></p>
<a class="btn btn-primary booking" href="https://outlook.office365.com/book/BookameetingwithMarek@gfi.fr/" target="_blank" rel="noopener">Book a meeting</a></div></div></div></div>
]]></content:encoded>
					
					<wfw:commentRss>https://nearshore-it.eu/technologies/big-data-management-with-odag/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
