Cognitive M&A – Leveraging Cognitive Search & Analytics for Successful Mergers and Acquisitions

Mergers and acquisitions provide one avenue for organizations to grow via synergistic gains, strategic positioning and diversification. Even with an abundance of M&A activity, mergers tend to fail at the business process and information integration levels. The success of a merger can be greatly enhanced when business processes are integrated and information is seamlessly unified by gathering it from both organizations, analyzing it, establishing clusters of semantically similar information, and finding common patterns. Cognitive search & analytics platforms provide the necessary capabilities to accomplish all of this, thereby helping facilitate merger and acquisition initiatives and significantly increasing the odds of success.

ANATOMY OF A SUCCESSFUL MERGER

Let’s envision the details of how this impacts the relevant stakeholders. At the outset, a cognitive search & analytics platform provides the organization with unified access to information from both organizations and beyond. Users can leverage out of the box machine learning algorithms to explore and navigate this information. For example, the Clustering algorithm groups documents into topically-related clusters by analyzing the content and metadata. This is very useful for topical navigation and helps stakeholders identify similar documents based on named entities within the content. Automated classification is another useful technique for unifying information and improving navigability. In certain circumstances such as when classification rules do not exist but a properly classified sample set of content does, a Classification by Example algorithm can automatically create a model from the sample set, which can subsequently be applied across the combined set of content from both organizations to further enhance findability for stakeholders.
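To make the clustering idea concrete, here is a minimal, self-contained sketch in Python: documents become bag-of-words vectors and are grouped greedily by cosine similarity. This is an illustration only, not Sinequa's implementation; a production platform would use richer features (entities, metadata) and more scalable algorithms, and the sample documents and threshold are invented.

```python
# Toy content-based document clustering: bag-of-words vectors plus
# cosine similarity, with a greedy grouping rule.
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term-frequency vector for a document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def cluster(docs, threshold=0.3):
    """Greedy clustering: attach each document to the first cluster whose
    representative (first) document is similar enough, else start a new one."""
    vectors = [vectorize(d) for d in docs]
    clusters = []  # list of lists of document indices
    for i, vec in enumerate(vectors):
        for members in clusters:
            if cosine(vec, vectors[members[0]]) >= threshold:
                members.append(i)
                break
        else:
            clusters.append([i])
    return clusters

docs = [
    "merger agreement terms and closing conditions",
    "closing conditions of the merger agreement",
    "quarterly sales pipeline review",
    "sales pipeline and revenue forecast",
]
print(cluster(docs))  # the two merger docs and the two sales docs group together
```

The output groups documents 0 and 1 (merger-related) and documents 2 and 3 (sales-related), which is the kind of topical grouping that supports navigation.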

Machine learning algorithms can also help match experts with other experts as well as relevant documents across the consolidating organizations. This is done dynamically by analyzing what people write and collaborate around to compute user profiles, which are subsequently analyzed to compute “peer groups” that connect stakeholders with similar interests and expertise across the consolidating enterprise. With these peer groups established, experts can be more effectively presented with relevant content using a collaborative filtering technique that compares preferred content across the peer group and surfaces valuable content to members of the peer group who have not previously been exposed to it. As you can see, a cognitive search & analytics solution facilitates smart information sharing across the consolidating enterprise. Usually a lack of sophisticated security controls impedes greater openness between consolidating entities. A search-based application, however, respects existing security profiles—making it easier to merge infrastructures securely.
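One simple way to picture peer-group collaborative filtering is the sketch below: interest profiles are compared with Jaccard overlap to form peer groups, and content liked by peers but unseen by the user is surfaced. The user names, topics, thresholds and document identifiers are all invented for illustration.

```python
# Toy peer-group collaborative filtering over user interest profiles.

def jaccard(a, b):
    """Set-overlap similarity between two interest profiles."""
    return len(a & b) / len(a | b) if a | b else 0.0

def peer_group(user, profiles, threshold=0.4):
    """Users whose interest profiles overlap enough with `user`'s."""
    return {u for u, p in profiles.items()
            if u != user and jaccard(profiles[user], p) >= threshold}

def recommend(user, profiles, liked):
    """Content preferred by peers that `user` has not yet seen."""
    seen = liked.get(user, set())
    suggestions = set()
    for peer in peer_group(user, profiles):
        suggestions |= liked.get(peer, set())
    return suggestions - seen

profiles = {
    "alice": {"oncology", "clinical-trials", "biomarkers"},
    "bob":   {"oncology", "clinical-trials", "regulatory"},
    "carol": {"marketing", "branding"},
}
liked = {
    "alice": {"trial-report-7"},
    "bob":   {"trial-report-7", "biomarker-study-2"},
}
print(recommend("alice", profiles, liked))  # bob is alice's peer → {'biomarker-study-2'}
```

Here alice and bob share enough interests to be peers, so bob's document that alice has not yet read is surfaced to her.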

A cognitive search & analytics solution also helps to identify areas of risk and to resolve outstanding issues before financial consequences occur. For example, risks could include content containing Personally Identifiable Information (PII) or content with no security settings applied. This is done by employing text-mining agents (TMAs), which provide out-of-the-box rules-based capabilities to extract elements from unstructured text. TMAs can be configured to incorporate terms and phrases specific to any part of the business. A cognitive search & analytics solution enables a quick, seamless and successful consolidation of organizations. Typically, in a large enterprise this is done as a series of search-based applications (SBAs) that each pull from a Logical Data Warehouse (LDW), which is essentially a central cache of unified information. In the next sections, we will look at specific areas of the business that typically benefit the most from this approach.
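A rules-based text-mining agent can be pictured as a set of named patterns applied to text. The sketch below flags two common PII types (US-style Social Security numbers and email addresses) with regular expressions; real TMAs are configured with business-specific rules, and the patterns here are examples only.

```python
# Illustrative rules-based agent that flags PII patterns in unstructured text.
import re

PII_RULES = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # US Social Security number
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan(text):
    """Return {rule_name: [matches]} for every rule that fires on `text`."""
    hits = {}
    for name, pattern in PII_RULES.items():
        found = pattern.findall(text)
        if found:
            hits[name] = found
    return hits

doc = "Contact jdoe@example.com; SSN on file: 123-45-6789."
print(scan(doc))  # both rules fire on this document
```

A document that triggers any rule can then be routed for review or have security applied before the combined content set is exposed to users.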

SALES AND MARKETING 

Once consolidation is underway, the organization must move quickly to combine sales and marketing activities, sales methodologies, pipelines and channels to drive revenue in the field and promote up-selling and cross-selling into new and existing market segments. The organization wants to minimize any potential lapse in the sales cycle for the newly merged company.

A cognitive search & analytics solution immediately equips sales teams with a single global access point to relevant, real-time and insightful information on products and customers—sales and customer notes, sales processes, product information and sales training are all immediately accessible. As previously mentioned, this is typically done using a dedicated search-based application (SBA).

An SBA for Sales and Marketing could provide unified access to the separate CRM systems and allow for the addition of shared content. As a result, no sales note, customer quirk or prospect is lost during consolidation. An SBA also offers the ability to push alerts and notifications out to users.

As sales representatives learn about new products, the SBA provides unified access to new marketing materials, sales process documentation, research documents and news articles to assist in training and to ensure that they are effectively representing the newly expanded product line. Often when a merger occurs, customers know more about the products from the other company than the sales reps do. With a cognitive search & analytics solution in place, an empowered and unified sales team can competently sell the acquired products and services.

FINANCE, ACCOUNTING, AND HUMAN RESOURCES

Finance, accounting and human resources are other key departments that need to be unified. A dedicated SBA provides complete, consolidated access to information from the disparate ERP systems.

A cognitive search & analytics solution provides administrators and content curators with the visibility across the enterprise necessary to manage all documentation from both parties related to the merger effectively and securely.

Multiple IT and finance systems always increase the complexity of a merger. Organizations acquiring a large IT infrastructure need to identify the systems acquired and the value of the data in those systems.

A cognitive search & analytics solution enables the required visibility and helps to expose any redundant or unused systems that might be eliminated. For example, it can be used to monitor and analyze usage of the underlying repositories and applications to show which sources are being used less frequently. Some organizations have even been able to standardize on a single search platform and eliminate the need for multiple search applications. In other cases, a cognitive search & analytics solution works as a stopgap, providing access to legacy systems awaiting migration and thereby reducing the cost of forcing an immediate migration.
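The usage monitoring described above amounts to a simple aggregation over access logs; the sketch below ranks content sources by access frequency so that rarely used systems stand out as retirement candidates. The log format and source names are hypothetical.

```python
# Rank content sources by access frequency from (hypothetical) query logs.
from collections import Counter

def source_usage(log_lines):
    """Count accesses per source from lines like 'timestamp source_id user'."""
    counts = Counter(line.split()[1] for line in log_lines if line.strip())
    return counts.most_common()  # most-used sources first

logs = [
    "2017-03-01T09:00 crm-legacy alice",
    "2017-03-01T09:05 sharepoint bob",
    "2017-03-01T09:06 sharepoint carol",
    "2017-03-02T10:00 sharepoint alice",
]
print(source_usage(logs))  # sharepoint leads; crm-legacy is a retirement candidate
```

Over a longer window, consistently low-traffic sources become the natural starting point for consolidation or migration discussions.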

Organizations can leverage Sinequa’s scalable platform as the cognitive search & analytics solution to connect information across the consolidating enterprise by leveraging previously mentioned machine learning algorithms like Clustering, Classification by Example, etc. as well as more traditional rules-based enrichment techniques. This helps the organization deliver unified access to stakeholders, ensure accuracy, reduce risk and gain insight into the complex systems across all affected departments. Enterprise Application Integration is usually a multi-year project. While cognitive search & analytics solutions do not replace such an integration, they provide comprehensive visibility in a very short time.

RESEARCH AND DEVELOPMENT

The cost of R&D increases when multiple teams unknowingly work on solutions to the same problems or fail to recognize and utilize the work done in past research. Cognitive search & analytics solutions positively impact R&D by accurately combining research data from the consolidating organizations, giving users real-time access and reducing the duplication of content, efforts and sometimes entire projects. They further reduce duplication by connecting experts working in the same subject area.

Sometimes a key driver in a merger or acquisition is gaining access to intellectual property, which often includes the expertise of the other organization’s knowledge workers. Sinequa’s Find the Expert capability gives employees from each organization the ability to discover the most knowledgeable people on a variety of topics, to view their profiles and to find associated information. This accelerates R&D discovery by enabling users to navigate information in topical clusters, leveraging the clustering algorithm described earlier, and to refine results by content facets.

These tools enable users to find past research and hidden relationships – including relationships with external experts with whom both companies have collaborated in the past – that would have otherwise been missed, thereby increasing speed-to-market.

The organization is also able to gain greater market share by leveraging and optimizing the information acquired instead of simply discarding projects in progress. One of the key drivers in a merger is the ability to retain and share as much knowledge as possible. Sinequa’s collaborative capabilities, such as tagging, bookmarking and automatic feedback loops, spawn greater innovation in the newly expanded R&D department. For example, a scientist can comment on an old publication and explain how it relates to new research, broadening collaboration and further increasing go-to-market speed.

HOW IT WORKS

Sinequa has developed an innovative and simple-to-use Cognitive Search and Analytics platform that offers Unified Information Access (i.e. information from any source, in any format, whether structured or unstructured, internal or external, through a single platform and user interface) to respond to even the most difficult information access challenges of large companies and organizations. The solution is composed of three main components:

  • A powerful (natural language) analytics platform that gives structure to the most unstructured content. The analysis and indexing of data solve typical enterprise challenges involving multiple sources, unstructured and structured content, multiple languages, high data volume and high update frequencies.
  • Simple and intuitive user interface that brings simplicity to complex search and analytics. Sinequa’s out-of-the-box user interface leverages the familiarity of common search tools enhanced with faceted search tools (i.e. filters) and analytics, giving the user an immediate picture of the information available from all enterprise sources. The UI is easily customizable to reflect specific appearance and navigation goals as well as corporate standards/branding, and is extensible, allowing third-party or custom UIs to be easily integrated via a provided search API (Application Programming Interface).
  • A powerful, open and scalable technical architecture (GRID). Sinequa’s highly-scalable architecture was designed from the ground up to support multiple search needs starting on a single server cluster, providing a cost-effective solution that allows companies to respond to multiple business needs now and in the future with minimal hardware investment (whether the application is deployed on premise or in a private cloud). The architecture is distributed and modular to support virtually any data source addition and growth, often without any additional infrastructure investment.

Leveraging these combined components from within the Sinequa platform has proven to accelerate the integration of business processes and to amplify the expertise of the whole by seamlessly and securely unifying disparate information from each organization involved in a merger or acquisition.

CONCLUSION

A cognitive search & analytics platform facilitates information transparency and communication across the entire enterprise, minimizing disruptions while integrating teams and departments during mergers and acquisitions. Applying this technology as a solution effectively operationalizes the information access, speed and control needed to ensure long-term success in the newly formed organization.


Gartner Named Sinequa a Leader in Its Magic Quadrant for Insight Engines

As the CEO of Sinequa, I am proud that Sinequa was recognized as a Leader in the recently released Magic Quadrant for Insight Engines 2017. Being named a Leader once again underlines our continued progress. (We have previously been positioned as a Leader in the Magic Quadrant for Enterprise Search.) Gartner selects Leaders for their “Completeness of Vision” and their “Ability to Execute.” It is good to see that others find our vision convincing and believe in our ability to realize it!

More reassuring still is the testimonial of our customers that led Gartner to state that “reference customers regarded Sinequa’s roadmap and future vision for its software to be particularly attractive. All indicated that those were significant reasons for choosing the software.”

As an established Cognitive Search platform, we’re continuing to evolve our vision and invest in enabling the largest organizations around the globe, such as Airbus, AstraZeneca, Bristol Myers Squibb, Credit Agricole, and Siemens, to get more value from their ever-growing and diverse enterprise data, as well as broadening the impact of search and analytics within the digital workplace of their employees.

According to Gartner:

“Insight engines apply relevancy methods to describe, discover, organize and analyze data. This allows existing or synthesized information to be delivered proactively or interactively, and in the context of digital workers, customers or constituents at timely business moments.”


Get your copy of the full report here and see why Sinequa is among the three Leaders of the 13 vendors evaluated in this Magic Quadrant.


Machine Learning Becomes Legit, but Not Mainstream in 2017


There has been a lot of hype around machine learning lately. Over the past decades, we’ve heard about various concepts around machine intelligence that in most cases didn’t get anywhere. But more and more frequently, organizations are learning how to bring together all the ingredients needed to leverage machine learning, and there is a simple reason for that: according to Moore’s law, the performance of microprocessors has increased by a factor of more than 16 million since 1980! A program that would have run on a 1980 computer for more than half a year today delivers its results in one second!

That is why I think Machine Learning will be the story for 2017. We’ll see it move from a mystical, over-hyped holy grail, to more real-world, successful applications. Those who dismiss it as hocus-pocus will finally understand it’s real; those who distrust it will come to see its potential; and companies that apply ML to appropriate use cases will achieve real business benefit without the high cost of entry that was common in years past. In 2017 it will be clear that it has a credible place in the business toolkit.

The four necessary enablers for machine learning – huge parallel processing resources, cheap storage, large and appropriate data sets, and accessible machine learning algorithms – are all now mainstream. Most large organizations have readily-available access to all these components (appropriate data sets are potentially the only open question, as they are always business- and use-case-specific), which makes machine learning a real possibility to reduce risk, increase customer satisfaction and loyalty, create new business models, identify patterns, and optimize complex systems.

One area where machine learning is growing rapidly and already showing success is for cognitive search and analytics applications. It won’t take over core algorithms anytime soon, but ML is already supplementing and enhancing search results based on user actions and smart analysis of content.

I don’t foresee machine learning achieving “mainstream” status in 2017, but it will within the next few years because the technology is advancing exponentially, quickly enabling its use in broader contexts.

For more on my complete prediction on machine learning, check out this article in Virtual Strategy Magazine.



Artificial Intelligence in 2017: Expands Capabilities, but Impacts the Workforce

The beginning of the new year is a good time to reflect on the events of 2016 and on their forebodings for the coming year and beyond. There has no doubt been a great deal of buzz around artificial intelligence (AI) this year. However, it’s difficult to sort through what’s hype and what’s not to determine where these technologies will actually take us in 2017. While we know the trend will continue in some form, what will be new or different next year? Here are some of my predictions:

Artificial Intelligence is taking the industry by storm, and not just in “Westworld.” We’re entering a new phase of AI thanks to advances in computing power and volume of data. This has opened the door to solve computational problems on a scale that no human mind could approach – even in a lifetime. The result is that computers are now able to provide responses that aren’t dictated by a collection of “if A, then B” rules, offering results that can only be explained by saying that the computer “understands.” The benefit is that complex and time-consuming cognitive processes can now be automated, and we can do things at scale that were previously impossible because unlike humans, computers are not overwhelmed by volume.

We’re definitely headed in the direction of workforce displacement and I believe it’s going to happen quickly, as there are huge economic incentives to increase efficiency and to automate manual tasks. This will happen faster than we expect because we think linearly, while technology is advancing exponentially. We struggle with that perspective because it quickly outpaces what we can readily grasp, whether that be in size or speed, or both. This will bring additional challenges because the disruption will occur across the occupational spectrum (unlike the industrial revolution, which primarily impacted “low-skill” jobs). I don’t see any particular sector being hit by this tidal wave in 2017, but AI is a disruptor like we’ve never seen before and it will be here soon whether we are ready for it or not.

However, with this transformation, tasks that have been impractical because of the time and labor involved now become feasible, which means we’ll be able to do things we haven’t been able to do before. It will also free us from many mundane and repetitive tasks, enabling people to focus on new or more valuable activities. This will increase efficiency in the workplace as well as consistency, which will improve quality and safety. So while the workforce will look very different from how it looks today – certainly in 10 years and probably in five – AI and ML are going to greatly extend and expand our capabilities in ways that, for now, we can only imagine.

What are your predictions for 2017 and beyond? For a full list of my predictions on AI and other topics such as machine learning and big data, check out my post in VMblog.


5 Ways Machine Learning Makes Your Search Cognitive


Artificial intelligence, machine learning, deep learning, cognitive computing…no doubt there is a lot of buzz out there, but quite a bit of confusion too in terms of expectations and prerequisites. We often hear customers and prospects say: “I want an AI assistant that tells me what to expect and what to do next,” or “It takes a lot of time to train an AI assistant to make it an expert in my field, doesn’t it? I want something that fits in my budget and I want it now.” Users also often have many questions regarding the potential of machine learning for end users in their work environment. In this blog post, I’m sharing our initial thoughts about machine learning algorithms and how they empower cognitive search and analytics platforms to deliver better insights in a relevant work context.

Machine learning algorithms often operate in two phases: a learning phase and a model application phase. In the learning phase, manually classified data is analyzed iteratively to extract a model. In the model application phase, the extracted model is applied to new inputs to predict results.
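The two phases can be illustrated with a deliberately tiny supervised learner: the learning phase builds per-label word counts from pre-labeled text, and the application phase scores new inputs against that model. This is a toy stand-in for real supervised algorithms, and the labels and training documents are invented.

```python
# Minimal two-phase learner: learn a word-count model from labeled text,
# then apply it to classify new text.
from collections import Counter, defaultdict

def learn(labeled_docs):
    """Learning phase: build per-label word counts from (text, label) pairs."""
    model = defaultdict(Counter)
    for text, label in labeled_docs:
        model[label].update(text.lower().split())
    return model

def apply_model(model, text):
    """Application phase: predict the label whose vocabulary overlaps most."""
    words = Counter(text.lower().split())
    def score(label):
        return sum(min(count, words[w]) for w, count in model[label].items())
    return max(model, key=score)

training = [
    ("invoice payment due net 30", "finance"),
    ("wire transfer and payment terms", "finance"),
    ("clinical trial enrollment protocol", "research"),
    ("trial protocol amendment", "research"),
]
model = learn(training)
print(apply_model(model, "payment received for invoice"))  # → finance
```

Real algorithms learn far richer models, but the separation is the same: an expensive, offline learning step produces a model that can then be applied cheaply to each new input.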

Machine learning algorithms depend strongly on the quality of their data: the quality of the results correlates directly with the quality of the input. Cognitive search and analytics platforms can use natural language processing (NLP) and other analytics to enrich structured and unstructured data from different sources (entity extraction, detection of relationships within the data, etc.). This “data pre-processing” stage allows machine learning algorithms to start from enriched data and deliver relevant results much faster. These results continuously enrich the index/logical data warehouse and thus make it easier to answer users’ queries in real time.

A performant cognitive search and analytics platform must integrate machine learning algorithms with its NLP and other analytics capabilities to deliver the most intelligent and relevant search results to users. Below are five ways machine learning makes search cognitive:

  • Classification by example – a supervised learning algorithm used to extract rules (create a model) to predict labels for new data given a training set composed of pre-labeled data. For example, in bioinformatics, we can classify proteins according to their structures and/or sequences. In medicine, classification can be used to predict the type of a tumor to determine if it’s harmful or not. Marketers can also use classification by example algorithms to help them predict if customers will respond to a promotional campaign by analyzing how they reacted to similar campaigns in the past.
  • Clustering – an unsupervised learning algorithm whereby we aim to group subsets of documents by similarity. Sinequa uses clustering when we don’t necessarily want to run a search query on the whole index. The idea is to limit our search to a specific group of documents in each cluster. Unlike classification, the groups are not known beforehand, making this an unsupervised task. Clustering is often used for exploratory analysis. For example, marketing professionals can use clustering to discover different groups in their customer/prospect database and use these insights to develop targeted marketing campaigns. In the case of pharmaceutical research, we can cluster R&D project reports based on similar drugs, diseases, molecules and/or side effects cited in these reports.
  • Regression – a supervised algorithm that predicts continuous numeric values from data by learning the relationship between input and output variables. For example, in the financial world, regression is used to predict stock prices according to the influence of factors like economic growth, trends or demographics. Regression can also be used to create applications that predict traffic-flow conditions depending on the weather.
  • Similarity – not a machine learning algorithm itself, but a computationally heavy process that builds a matrix synthesizing the pairwise interactions between data samples. It often serves as a basis for the algorithms cited above and can be used to identify similarities between people in a given group. For example, pharmaceutical R&D can rely on similarity applications to constitute worldwide teams of experts for a research project based on their skills and their footprints in previous research reports and/or scientific publications.
  • Recommendation – merges several of the basic algorithms above to create a recommendation engine proposing content that might be of interest to users. One common approach is “content-based recommendation,” which offers personalized recommendations by matching users’ interests with the descriptions and attributes of documents.
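As a worked example of the regression item above, here is ordinary least squares for a single input variable, learning y = a*x + b from observed pairs. The data is synthetic and exactly linear, so the fit recovers the line exactly.

```python
# Ordinary least squares for one input variable: fit y = a*x + b.

def fit_line(xs, ys):
    """Learn slope and intercept minimizing squared error over (x, y) pairs."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x  # (a, b)

def predict(model, x):
    """Apply the learned line to a new input."""
    a, b = model
    return a * x + b

# e.g. traffic delay (minutes) as a function of rainfall (mm) -- synthetic data
model = fit_line([0, 2, 4, 6], [5, 9, 13, 17])
print(predict(model, 3))  # exactly linear data, so the prediction is 11.0
```

Real-world targets such as stock prices involve many input variables and noisy data, but the principle is the same: learn the input-output relationship from observations, then predict continuous values for new inputs.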

All the algorithms above need to be executed in a fast and scalable computing environment to deliver the most precise results. Currently, the Spark distributed computing platform offers the most powerful capabilities for executing machine learning algorithms efficiently. It is designed to scale from a single server to thousands of machines, and it runs much faster than plain Hadoop MapReduce.

Our recent contribution in the KM World Whitepaper “Best Practices in Cognitive Computing” highlights concrete use cases, describing how cognitive information systems are capable of extracting relevant information from big and diverse data sets for users in their work context. Get your copy here.
