Organizations keep messing up their data insights — here’s how to fix it

The current economic downturn and the global disruption caused by the pandemic have created a “digital awakening.” Boards and C-level executives are accelerating digital transformation initiatives to drive efficiency, growth, and business resiliency, and to remain relevant and competitive in the new digital normal. To make effective decisions, these executives need data. To make those decisions in a timely manner, they also need to automate the way they capture, process, analyze, and draw insights from that data.

So, if we all agree that timely access to rich insights from data is the holy grail, then the obvious question we’re left with is: “What do I need to reach this state of digital nirvana?”

In this piece, I’ll take a hypothesis-based approach to focused outcomes, then work backward to the type of technology that can help executives achieve the value they badly need to navigate the current crisis while positioning for the rebound.

Most of us would likely agree that insights from data are extremely beneficial, both immediately and over the long term. So why do so many organizations still struggle to capture, visualize, understand, and optimize business-critical information from the moment it flows in?

What’s actually going on?

It’s not that industries don’t understand the value of data, particularly as artificial intelligence (AI) and augmented analytics have gained traction globally. Sixty percent of CIOs say that data and analytics will affect their businesses in the next three years, and 73% of companies plan to invest in DataOps to support AI and machine learning initiatives.

But to leverage any of these advanced analytics technologies, organizations first need to capture the data. That includes structured data from sources such as websites, business and desktop applications, and databases. Even more important is unstructured data, which accounts for far more of an organization’s overall data than structured data does. Unstructured data is the content found in documents and emails, for instance.
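To make that more concrete, here is a minimal Python sketch of turning an unstructured email into a structured record. The schema and the regex rules are purely illustrative assumptions; production capture tools rely on OCR, NLP, and trained classifiers rather than hand-written patterns.

```python
import re
from dataclasses import dataclass

@dataclass
class CapturedRecord:
    """Structured record distilled from an unstructured message (hypothetical schema)."""
    sender: str
    invoice_id: str | None
    amount: float | None

def capture_from_email(raw_email: str) -> CapturedRecord:
    # Naive regex-based extraction, standing in for a real capture engine.
    sender = re.search(r"From:\s*(\S+@\S+)", raw_email)
    invoice = re.search(r"Invoice\s*#?\s*([\w-]+)", raw_email, re.IGNORECASE)
    amount = re.search(r"\$([\d,]+\.\d{2})", raw_email)
    return CapturedRecord(
        sender=sender.group(1) if sender else "unknown",
        invoice_id=invoice.group(1) if invoice else None,
        amount=float(amount.group(1).replace(",", "")) if amount else None,
    )

print(capture_from_email("From: ap@acme.com\nPlease pay Invoice #A-1042 for $1,250.00 by Friday."))
```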

Once we’ve ingested 100% of the available structured and unstructured data, the data must be orchestrated into appropriate workflows to feed downstream systems.

The typical organization uses several disparate applications to process a single transaction, and in many cases humans serve as the ‘connective tissue’ among them. This is expensive, slow, and prone to error.

Let’s look at a typical customer onboarding process. It likely starts with a digital channel feeding a CRM, which triggers credit checks, passes through a decision engine to initiate Know-Your-Customer (KYC) actions, and sends the customer notifications with status updates and documentation requests, while yet another application provisions the user account.

To make this happen seamlessly, an automated end-to-end digital customer workflow can be designed to orchestrate the flow of data, eliminate errors, reduce cycle times and increase compliance.
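As a rough illustration of what such an orchestration layer does, the sketch below chains hypothetical onboarding steps into a single automated workflow. The step functions and field names are placeholders invented for this example, not any vendor’s actual API.

```python
from typing import Callable

# Placeholder steps; in a real deployment each would call out to the CRM, credit bureau,
# KYC provider, notification service, or core account system through its own API.
def create_crm_record(ctx: dict) -> dict:
    ctx["crm_id"] = "CRM-001"
    return ctx

def run_credit_check(ctx: dict) -> dict:
    ctx["credit_ok"] = True
    return ctx

def run_kyc(ctx: dict) -> dict:
    ctx["kyc_status"] = "passed"
    return ctx

def notify_customer(ctx: dict) -> dict:
    print(f"Notified {ctx['email']}: KYC {ctx['kyc_status']}")
    return ctx

def provision_account(ctx: dict) -> dict:
    ctx["account_id"] = "ACCT-123"
    return ctx

ONBOARDING: list[Callable[[dict], dict]] = [
    create_crm_record, run_credit_check, run_kyc, notify_customer, provision_account,
]

def run_workflow(steps: list[Callable[[dict], dict]], ctx: dict) -> dict:
    """Run each step in order; stop and record the error if any step fails."""
    for step in steps:
        try:
            ctx = step(ctx)
        except Exception as exc:
            ctx["error"] = f"{step.__name__}: {exc}"
            break
    return ctx

print(run_workflow(ONBOARDING, {"email": "new.customer@example.com"}))
```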

Automation can also accelerate the generation of data insights by rapidly aggregating different types of data: business data and data from different channels and sources, including operational and processing data, customer data, and customer or stakeholder feedback. The result is a faster and more accurate way to visualize business intelligence.
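Here is a small illustration of that aggregation step, assuming two hypothetical extracts (operational data and customer feedback) that get joined into a single view a BI tool could visualize.

```python
import pandas as pd

# Hypothetical extracts from two different sources.
operations = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "cycle_time_hours": [4.5, 12.0, 6.2],
    "errors": [0, 3, 1],
})
feedback = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "satisfaction": [9, 4, 7],
})

# Join the sources into one view that a dashboard could chart directly.
combined = operations.merge(feedback, on="customer_id")
print(combined.sort_values("satisfaction"))
```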

Finally, every organization wants to maximize the value of its data. A great approach is to “open it up” a bit through data democratization, using the right tools to empower employees to spend more time engaging with and exploring the data to gain insights they can use in their roles.

Data analytics can change the game

Banks, insurance companies, transportation and logistics firms, healthcare companies, government agencies and more — every day, an avalanche of data pours into organizations across every industry. This data comprises a variety of formats and originates from multiple sources.

Many organizations still struggle with information silos, inefficient legacy infrastructure, and uncaptured or unstructured data. But a growing number are now successfully automating the capture of data and its transformation into the right format.

Common lingering headaches include high error rates when merging information as it enters the organization, poor documentation, and differing rules and requirements. All of this can contribute to production or service delays and, ultimately, frustrated users.

This is where analytics capabilities that are part of a larger, integrated intelligent automation solution can be game-changing. Automation and workflow create a new ‘digital frontier’ that removes friction and raises efficiency levels; adding analytics on top ultimately enhances decision-making.

The business value of integrated intelligent automation and analytics is enhanced oversight of key business processes, streamlined workflows, early warning of bottlenecks or service interruptions before they happen, and faster delivery of critical business data and decisions.

Automation helps you make the most of your data

The most elementary application of automation to your data can answer questions such as: Is the data we’re capturing showing increased errors, and if so, why? Can we isolate user productivity by department and determine where extra training would be beneficial? Are our workflows processing data outside of our SLAs, and if so, by how much?
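Here is a sketch of how those questions might be answered once the data is captured, using pandas and an invented transaction log; the column names and the 60-minute SLA are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical transaction log; column names are illustrative only.
log = pd.DataFrame({
    "department":    ["claims", "claims", "billing", "billing", "billing"],
    "errors":        [0, 2, 1, 0, 4],
    "cycle_minutes": [38, 95, 41, 30, 130],
})
SLA_MINUTES = 60

# Error rate and average handling time by department.
by_dept = log.groupby("department").agg(error_rate=("errors", "mean"),
                                        avg_minutes=("cycle_minutes", "mean"))

# How far SLA-breaching items run past the threshold.
overage = log.loc[log["cycle_minutes"] > SLA_MINUTES, "cycle_minutes"] - SLA_MINUTES

print(by_dept)
print("Average SLA overage (minutes):", overage.mean())
```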

Analytics is about uncovering patterns, particularly unanticipated ones, helping organizations act on those real-time insights quickly and proactively, and predicting potential future issues.

Unfortunately, according to IDC, only 10% of usable data is actually analyzed. It’s true that organizations have become quite adept at collecting data, even in very large amounts. But many struggle to transform the massive amounts of information acquired through intelligent automation into something understandable and actionable that supports better business decisions.

This brings us to the essential tools every user tasked with maximizing the value of organizational data should have. It starts with self-service, interactive, web-based dashboards and visualizations that don’t require IT to build new reports, modify queries, or write code, syntax, or scripts.

Below is a list of the key dashboard capabilities (a short sketch of a couple of them follows the list). Users should be able to:

  • Uncover real-time process trends through data visualization and manipulation to improve performance
  • Drill down to identify low-confidence fields and set the right actions to improve process quality
  • Filter batch and document numbers by class and pinpoint bottlenecks
  • Evaluate team productivity statistics: number of batches, documents, and pages processed per time period, team, or process
  • Use predefined operations metrics to quickly view and accurately measure and improve performance
  • Get objective performance monitoring of human operators, business processes, and software performance
  • Use a mobile or desktop view from any device
  • Protect data on the fly and apply security roles to control access
  • Drill down to the lowest level of process data to account for, track and trace processes
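To ground a couple of those capabilities, here is a brief sketch of the kind of drill-down a dashboard performs under the hood: flagging low-confidence fields and summarizing per-team productivity. The data, field names, and threshold are hypothetical and not tied to any specific capture or BI product.

```python
import pandas as pd

# Hypothetical per-field capture results as they might feed a dashboard.
fields = pd.DataFrame({
    "team":       ["east", "east", "west", "west"],
    "document":   ["doc-1", "doc-2", "doc-3", "doc-4"],
    "field":      ["invoice_total", "po_number", "invoice_total", "po_number"],
    "confidence": [0.98, 0.61, 0.42, 0.95],
    "pages":      [3, 1, 5, 2],
})

# Drill down: fields below a confidence threshold that may need review or retraining.
low_confidence = fields[fields["confidence"] < 0.7]

# Productivity: documents and pages processed per team.
productivity = fields.groupby("team").agg(documents=("document", "nunique"),
                                          pages=("pages", "sum"))

print(low_confidence[["team", "document", "field", "confidence"]])
print(productivity)
```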

Final thoughts

Many organizations are leaving a great deal of insight on the table when it comes to their data. There’s never been a better time to move beyond the boundaries of traditional systems and redefine what analytics means as digital transformation accelerates in the ‘new normal’.

Integrated intelligent automation, connecting systems, data, and people to achieve outcomes, combined with powerful business intelligence, is the key. The ability to garner real-time insights from this activity is the path to the agility and resiliency successful organizations need to thrive today and over the long term.

Published October 7, 2020 — 08:00 UTC

Fire Emblem: Shadow Dragon and the Blade of Light is getting its first English release

Fire Emblem: Shadow Dragon and the Blade of Light is launching in the US for the first time ever. The tactical roleplaying game, originally released in 1990, will be available on the Nintendo Switch for $5.99 on December 4th.

As the first game in the Fire Emblem series, Shadow Dragon and the Blade of Light stars Marth, a character best known Stateside for his appearance in Super Smash Bros. The Switch edition of the game will include fast-forward, rewind, and save state features.

It’s important to note that the release is for a limited time only, until the franchise’s 30th anniversary on March 31st, 2021. Shadow Dragon and the Blade of Light isn’t the first game to adopt such a strategy; rather, it appears to be building on a Disney Vault-type play on Nintendo’s part. Super Mario 3D All-Stars’ availability is also set to expire on March 31st of next year.

An anniversary edition — which includes a stylized physical NES box and a replica NES Game Pak art piece, in addition to an art book and download code — will be available for $49.99 at select retailers.

Republican lawmakers are furious after Twitter asks users to read stories before retweeting

House Judiciary Committee Republicans and committee member Rep. Doug Collins (R-GA) are spreading a misleading claim about a new Twitter feature that asks users to read articles before retweeting them. Earlier this year, Twitter started testing a prompt to discourage knee-jerk retweets. It appears on links across the entire service, but Republican lawmakers have cited individual warnings on right-leaning articles as the latest of many censorship accusations.

Twitter announced last month that it would roll out the feature across its mobile apps, describing it as a way to “help promote informed discussion.” When you hit the retweet button on a link you haven’t visited, Twitter adds a label above the confirmation menu, warning that “headlines don’t tell the full story” and offering a chance to check the story out.

This is optional; you can ignore it and simply confirm the retweet if you want, and it doesn’t add any extra taps. But some conservative Twitter users expressed fury at the warning. Former PJ Media editor David Steinberg claimed that Twitter “placed a headline warning label” on a Wall Street Journal article about Republican congressional candidate Kimberly Klacik, saying the prompt “should disturb every American.” The label appears if you try to retweet many other WSJ articles on a variety of topics as well as stories from The Verge and other media outlets.

The claim was amplified by Republican members of Congress. Collins claimed that Twitter was “censoring” all tweets from Sean Hannity, citing labels on links to Hannity.com. The Twitter account for Judiciary Committee Republicans made a similar claim about a Hannity article, insinuating that Twitter had specifically added the warning to a story about allegedly leaked emails from Hunter Biden.

Twitter’s communications team tweeted a somewhat exasperated response. “We’re doing this to encourage everyone to read news articles before Tweeting them, regardless of the publication or the article,” a spokesperson wrote. “If you want to retweet or quote tweet it, literally just click once more.”

It’s not necessarily surprising that Twitter’s new feature would raise hackles since it comes on the heels of two unpopular Twitter decisions. Twitter blocked a link to New York Post articles about Hunter Biden’s emails last week, citing a ban on “hacked content,” before apologizing and changing its policy. It also started temporarily asking users to quote tweets instead of retweeting them, another attempt to encourage more engagement. Today, a Senate committee approved subpoenas for Twitter CEO Jack Dorsey and Facebook CEO Mark Zuckerberg, calling them to testify about restricting the Post story’s reach.

Twitter doesn’t seem to apply the warning to every link either, and that’s caused some confusion online. As the National Republican Senatorial Committee noted, you can retweet links to Democratic fundraising platform ActBlue without a warning. However, we also received no warning when retweeting a link to Republican equivalent WinRed. We’ve asked Twitter for more clarification on when the label appears. But whatever its answer, the feature is far more widespread than these lawmakers suggest.

Google Maps launches a new developer solution for on-demand ride and delivery companies

The Google Maps Platform, the developer side of Google Maps, is launching a new service for on-demand rides and delivery companies today that ties together some of the platform’s existing capabilities with new features for finding nearby drivers and sharing trip and order progress information with customers.

This isn’t Google Maps Platform’s first foray into this business. Back in 2018, the company launched a solution for in-app navigation for ridesharing companies, for example. At the time, the team didn’t focus much on delivery solutions, but that’s obviously one of the few booming markets right now, thanks to the COVID-19 pandemic.

“Building on 15 years of experience mapping the world, the On-demand Rides & Deliveries solution helps businesses improve operations as well as transform the driver and customer journey from booking to arrival or delivery–all with predictable pricing per completed trip,” Google senior product manager Eli Danziger writes in today’s announcement.

At the core of the service is the Google Maps routing service, which developers can tweak for deliveries by bike or motorcycle, for example, and to find optimized routes with the shortest or fastest path. The team notes that this so-called ‘Routes Preferred’ feature also enables arrival time predictions for time-sensitive deliveries and pricing estimates.

The other new feature of this platform is to enable developers to quickly build an experience that helps users find nearby drivers. Imaginatively called ‘Nearby Drivers,’ the idea here is about as straightforward as you can imagine and allows developers to find the closest driver with a single API call. They can also add custom rankings, based on their specific needs, to ensure the right driver is matched to the right route.
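Google hasn’t published the request shape in this announcement, so the following is only a conceptual sketch of what “closest driver with a custom ranking” involves: a great-circle distance plus a small rating-based tie-breaker. It is not the actual Nearby Drivers API, and the driver data is invented for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical driver pool; a real service would pull this from a fleet backend.
drivers = [
    {"id": "d1", "lat": 37.7763, "lon": -122.4167, "rating": 4.9},
    {"id": "d2", "lat": 37.7790, "lon": -122.4312, "rating": 4.6},
]
pickup = (37.7749, -122.4194)

# Rank mostly by distance, with rating as a small custom tie-breaker.
def score(driver):
    return haversine_km(pickup[0], pickup[1], driver["lat"], driver["lon"]) - 0.05 * driver["rating"]

print(min(drivers, key=score)["id"])
```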

Unsurprisingly, the platform also features support for in-app navigation, and that’s tied in closely with the rest of the feature set.

Developers can also easily integrate Google’s real-time trip and order progress capabilities to “keep customers informed from pickup to drop-off or delivery, with a real-time view of a driver’s current position, route, and ETA.”

All of this is pretty much what any user would expect from a modern ride-sharing or delivery app, so for the most part, that’s table stakes. The technology behind it is not, though, and a lot of delivery companies have set up large tech operations to build out exactly these features. They aren’t likely to switch to Google’s platform, but the platform may give smaller players a chance to operate more efficiently or enter new markets without the added expense of having to build this tech stack from the ground up — or cobble it together from multiple vendors.
