The 6 Best Data Engineering and Big Data Udacity Nanodegrees for 2025

The editors at Solutions Review compiled this list of the best data engineering and big data Udacity Nanodegree certifications to consider taking this year.

Data engineering is the process of designing and building pipelines that transport and transform data into a usable state for data workers to utilize. Data pipelines commonly take data from many disparate sources and collect it into data warehouses that represent the data as a single source. To do so, data engineers must manipulate and analyze data from each system as a pre-processing step.

With this in mind, the editors at Solutions Review have compiled this list of the best data engineering and big data Udacity Nanodegree certifications. Udacity is perfect for those looking to take multiple courses or acquire skills in several different areas, or for those who want the most in-depth experience possible through access to entire course libraries or learning paths. In all, Udacity is home to more than 160,000 students in more than 190 countries.

Note: The best Udacity Nanodegrees for Data Engineering are listed in alphabetical order.


The Best Udacity Nanodegrees for Data Engineering and Big Data

TITLE: Become a Data Architect Nanodegree

OUR TAKE: Learn how to plan, design, and implement enterprise data infrastructure solutions and create blueprints. This Nanodegree program takes 4 months to finish.

Description: In this program, you’ll plan, design, and implement enterprise data infrastructure solutions and create the blueprints for an organization’s data management system. You’ll create a relational database with PostgreSQL, design an Online Analytical Processing (OLAP) data model to build a cloud-based data warehouse, and design a scalable data lake architecture that meets the needs of big data. Finally, you’ll learn how to apply the principles of data governance to an organization’s data management system.

GO TO TRAINING

TITLE: Become a Data Engineer Nanodegree

OUR TAKE: This module will show you how to build a production-ready data infrastructure. This training will take you 5 months to complete, and intermediate Python and SQL skills are recommended.

Description: Learn to design data models, build data warehouses and data lakes, automate data pipelines, and work with massive datasets. At the end of the program, you’ll combine your new skills by completing a capstone project. Students will learn to create relational and NoSQL data models to fit the diverse needs of data consumers, as well as sharpen their data warehousing skills and deepen their understanding of data infrastructure.

GO TO TRAINING

TITLE: Become a Data Product Manager

OUR TAKE: This Nanodegree program will teach techniques for evaluating the data from live products, including how to design and execute various A/B and multivariate tests to shape the next iteration of a product.

Description: Learn how to apply data science techniques, data engineering processes, and market experimentation tests to deliver customized product experiences. Begin by leveraging the power of SQL and Tableau to inform product strategy. Then, develop data pipelines and warehousing strategies that prepare data collected from a product for robust analysis.

GO TO TRAINING

TITLE: Data Engineering with Microsoft Azure

OUR TAKE: Master the job-ready skills you need to succeed as a Microsoft Azure data engineer, such as designing data models and utilizing other in-demand components of the cloud computing service.

Description: Learners will acquire the skills needed to design data models, create data pipelines, and navigate large datasets on the Azure platform. Additionally, they will learn to build data warehouses, data lakes, and lakehouse architecture.

GO TO TRAINING

TITLE: Data Streaming Nanodegree

OUR TAKE: This program will help you learn the skills to advance your career in data engineering. Students will also build real-time applications to process big data, compile data, and run analytics.

Description: Learn how to process data in real-time by building fluency in modern data engineering tools, such as Apache Spark, Kafka, Spark Streaming, and Kafka Streaming. You’ll start by understanding the components of data streaming systems. You’ll then build a real-time analytics application.

GO TO TRAINING

TITLE: Learn SQL Nanodegree

OUR TAKE: This Udacity Nanodegree will help you learn SQL, the core language for big data analysis. Students should come prepared with a basic understanding of data types and plan to spend 2 months on this training.

Description: Perform analysis on data stored in relational and non-relational database systems to power strategic decision-making. Learn to determine, create, and execute SQL and NoSQL queries that manipulate and dissect large-scale datasets. Begin by leveraging the power of SQL commands, functions, and data cleaning methodologies to join, aggregate, and clean tables, as well as complete performance-tuning analysis to provide strategic business recommendations.

GO TO TRAINING


Solutions Review participates in affiliate programs. We may make a small commission from products purchased through this resource.

What to Expect at Safe Software's The Peak of Data Integration 2023 on September 5-7

Safe Software’s The Peak of Data Integration 2023 will offer learning and networking opportunities and hands-on training with FME.

What is The Peak of Data Integration?

With over 75 sessions and real-life stories from FME users worldwide, you won’t want to miss this opportunity to make connections and learn among the best in data integration who are conquering data mountains with FME.

Why You Should Attend

Join Safe Software (and Solutions Review!) at the Peak of Data Integration 2023, the largest gathering of FME users on earth. The event, which takes place from September 5-7 in Bonn, Germany, will offer learning and networking opportunities and hands-on training with FME.

About Safe Software

Safe Software is a major player in enterprise spatial data integration and helps global organizations maximize the value of their data. The company offers FME, an enterprise integration platform. FME helps organizations connect data and applications across systems, streamline operations with automated workflows, and deliver speed to insights. Safe Software serves customers in diverse industries, including Government, Utilities, Energy, AEC, Telecom, and Transportation.

FAQ

  • What: Safe Software: The Peak of Data Integration 2023
  • When: September 5-7, 2023
  • Where: Bonn, Germany (see registration page for more detail)

Register for The Peak of Data Integration 2023

What to Expect at Denodo DataFest Americas 2023 on September 12-14

Denodo DataFest 2023 will empower users to maximize their knowledge of Denodo and help increase ROI from data and analytics tools, models, and applications.

What is a Denodo DataFest?

Join your peers and other industry leaders at Denodo DataFest 2023 and explore the latest innovations in logical data management with Denodo. Exchange expertise and swap stories from the logical data management ecosystem. Hear firsthand from technology leaders, including data scientists, data engineers, cloud architects, enterprise architects, and business leaders, about their experiences architecting and building modern data infrastructures.

Why You Should Attend

Denodo DataFest 2023 is designed to empower customers to maximize their knowledge of the Denodo Platform and to help their organizations increase their ROI from data and analytics tools, models, and applications.

This event offers opportunities to learn about best practices and tips for leveraging the Denodo Platform’s logical approach for cloud modernization, self-service analytics, data science and machine learning (ML), data fabric, data mesh, and more.

  • Learn from fellow data practitioners by swapping strategies and sharing
    best practices
  • Build confidence and expertise in creating leading-edge logical data solutions to address your organization’s data integration and data management needs
  • Network with like-minded industry professionals who are using the Denodo Platform to conquer the challenges that come with data that is deeply buried, complex to access, difficult to integrate, and inaccessible to business users

About Denodo

The award-winning Denodo Platform offers a robust capabilities package for data integration, data management, and data delivery using a logical approach to enable self-service business intelligence, data science, hybrid/multi-cloud integration, and enterprise data services. Denodo touts customers across large enterprises and mid-market companies in over 30 industries. A pioneering company in the data virtualization space, Denodo was founded in Palo Alto, California, in 1999.

FAQ

  • What: Denodo DataFest 2023
  • When: September 12-14, 2023
  • Where: Boston, MA (see registration page for more detail)

Register for Denodo DataFest 2023

Safe Software's FME:23 Bringing Life to Data: Event Live Blog

Safe Software’s FME:23 rebrand launch event marks the start of a new era in spatial data integration. Safe Software Co-Founders Don and Dale will unveil the firm’s corporate rebrand for the first time and share sneak peeks of the upcoming FME release. Our editors will live blog all the news, tips, and insights in this space.

Who is Safe Software?

Safe Software is a major player in enterprise spatial data integration and has helped many global organizations maximize the value of their data. The company offers FME, an enterprise integration platform with comprehensive support for spatial data. FME helps organizations connect data and applications across systems, streamline operations with automated workflows, and deliver speed to insights. Safe Software serves customers in diverse industries, including Government, Utilities, Energy, AEC, Telecom, and Transportation.

Next, Safe Software will dive into the powerful new features of FME:23 — showcasing how they can enhance your workflows, simplify data transformations, and drive real-world business applications. Safe will share inspiring customer success stories, demonstrating how FME can revolutionize the way you work with data. Don and Dale will also provide insight into the future of FME, giving you a glimpse into what’s to come.

Coverage begins April 13 at 1:00 PM ET, and here’s what to expect.


4/13/2023, 2:16 PM

Tim King: That’s an event! Thanks for tuning in to our live coverage of Safe Software’s FME:23: Bringing Life to Data.

4/13/2023, 2:12 PM

Tim King: Q&A time.


4/13/2023, 1:59 PM

Tim King: New features for FME:23 are expected to roll out “in the next four weeks.”

4/13/2023, 1:56 PM

Tim King: Safe Software announces enhanced support for Databricks.


4/13/2023, 1:33 PM

Tim King: Don and Dale begin discussing the FME:23 product release.


4/13/2023, 1:27 PM

Tim King: Safe Software unveils its new logo, corporate brand, and website.


4/13/2023, 1:00 PM

Tim King: We’re live! Safe Software Co-Founders Don and Dale start the show by talking about the company’s history.


4/13/2023, 12:43 PM

Tim King: In addition to following along here (for the best access to links and embedded media), you can also watch the event’s virtual stream here.


4/13/2023, 12:36 PM

Tim King: Solutions Review is set to cover this exclusive Safe Software FME:23 event and we hope you’ll join us!

What to Expect at Safe Software's FME:23 Event on April 13

Safe Software’s FME:23 rebrand launch event marks the start of a new era in spatial data integration. Safe Software Co-Founders Don and Dale will unveil the firm’s corporate rebrand for the first time and share sneak peeks of the upcoming FME release.

Who is Safe Software?

Safe Software is a major player in enterprise spatial data integration and has helped many global organizations maximize the value of their data. The company offers FME, an enterprise integration platform with comprehensive support for spatial data. FME helps organizations connect data and applications across systems, streamline operations with automated workflows, and deliver speed to insights. Safe Software serves customers in diverse industries, including Government, Utilities, Energy, AEC, Telecom, and Transportation.

What is the FME:23 Product Launch?

This event will feature Safe Software’s Co-Founders, Don and Dale, who will discuss the latest product innovations driven by market direction. Attendees will also witness the first external debut of Safe Software’s corporate rebrand.

Next, Safe Software will dive into the powerful new features of FME:23 — showcasing how they can enhance your workflows, simplify data transformations, and drive real-world business applications. Safe will share inspiring customer success stories, demonstrating how FME can revolutionize the way you work with data. Don and Dale will also provide insight into the future of FME, giving you a glimpse into what’s to come.

Why (Virtually) Attend?

The free virtual event consists of a keynote by Don and Dale to share the company’s product vision and direction, the new identity, and a spotlight customer success story on how FME can revolutionize how you work with data and transform the enterprise. There will also be product innovations, future development, and a live Q&A panel.

FAQ

What: Safe Software’s FME:23 Product Launch & Corporate Rebrand

When: Thursday, April 13, 2023, from 1:00 PM to 2:30 PM ET (virtual)

Where: Virtual, free and open to all (see registration page for more details)

Register for Safe Software’s FME:23 Product Launch FREE

Solutions Review Names 7 Low-Code Data Engineering Vendors to Watch, 2023

Solutions Review’s Low-Code Data Engineering Vendors to Watch is an annual listing of solution providers we believe are worth monitoring. Companies are commonly included if they demonstrate a product roadmap aligning with our meta-analysis of the marketplace. Other criteria include recent and significant funding, talent acquisition, a disruptive or innovative new technology or product, or inclusion in a major analyst publication.

Low-code data engineering is a software development approach that enables data engineers to automate and streamline data workflows and pipelines with minimal coding required. It leverages visual interfaces and pre-built components to make it easier for users to perform data engineering tasks such as data extraction, transformation, and loading (ETL); data management; and data monitoring and troubleshooting.

The goal of low-code data engineering is to make data engineering accessible to a wider range of users, including business analysts, data scientists, and developers, who may not have specialized technical skills. This approach helps to speed up the development process, reduce errors, and make it easier to maintain and update data workflows over time.

These low-code data engineering Vendors to Watch have met at least two of our five points of inclusion and represent to some degree the evolution of the marketplace. It’s in that spirit we turn our attention to the immediate future. Providers are listed in alphabetical order. Provider names and logos are linked so you can learn more.


Ab Initio 

Ab Initio provides a unique data processing platform. It can process complex enterprise data using sophisticated business logic at any scale. The applications are built graphically and can run within a batch, streaming, in-memory, or microservice architecture. At runtime, the platform adapts to changes in the data formats, rules, and infrastructure, providing a new level of agility. The platform manages all these technical and operational foundations so that system architects, analysts, and developers can focus on the important parts.

Alteryx

Alteryx is a self-service data analytics software company that specializes in data preparation and data blending. Alteryx Analytics allows users to organize, clean, and analyze data in a repeatable workflow. Business analysts find this tool particularly useful for connecting to and cleansing data from data warehouses, cloud applications, spreadsheets and other sources. The platform features tools to run a variety of analytic jobs (predictive, statistical, spatial) inside a single interface.

Datameer

Datameer offers a data analytics lifecycle and engineering platform that covers ingestion, data preparation, exploration, and consumption. The product features more than 70 source connectors to ingest structured, semi-structured, and unstructured data. Users can directly upload data or use unique data links to pull data on demand. Datameer’s intuitive and interactive spreadsheet-style interface lets you transform, blend and enrich complex data toward the creation of data pipelines.

Decodable

Decodable offers a real-time data engineering service for developers and data engineers. The product lets data producers serve data directly to consumers with no intermediary, while data consumers can tailor the data to their needs without any back and forth. Built on the proven stream processing capabilities of Apache Flink, Decodable fully manages the underlying operations, providing engineers with a simple, easy-to-use development and implementation environment.

Fivetran

Fivetran is an automated data integration platform that delivers ready-to-use connectors, transformations and analytics templates that adapt as schemas and APIs change. The product can sync data from cloud applications, databases, and event logs. Integrations are built for analysts who need data centralized but don’t want to spend time maintaining their own pipelines or ETL systems. Fivetran is easy to deploy, scalable, and offers some of the best security features of any provider in the space.

Prophecy

Prophecy.io is a cloud-native data engineering product built on Apache Spark and Kubernetes. It helps enterprise organizations modernize their data engineering via open runtimes and hybrid cloud deployments. Prophecy does this by automatically migrating existing ETL workflows to Apache Spark and providing a full-stack data engineering product on cloud infrastructure.

Talend

Talend offers an expansive portfolio of data integration and data management tools. The company’s flagship tool, Open Studio for Data Integration, is available via a free, open-source license. Talend Integration Cloud is offered in three separate editions (SaaS, hybrid, elastic) and provides broad connectivity, built-in data quality, and native code generation to support big data technologies. Big data components and connectors include Hadoop, NoSQL, MapReduce, Spark, machine learning, and IoT.


Top Data Engineering Best Practices by ChatGPT

Solutions Review analytics and data management lead Tim King asked ChatGPT for the top data engineering best practices listed here. We reviewed it for clarity, edited it for grammar, and listed it below. Editor’s notes appear like this to show which questions we asked the AI. We started with: What is data engineering?

Data engineering is the process of designing, building, and maintaining the infrastructure to store, process, and analyze large amounts of data.

Data engineers work with data architects and data scientists to design and build data pipelines, data warehouses, and other data infrastructure. They are responsible for ensuring that data is collected, stored, and processed in a way that meets the needs of the organization.

Data engineering involves a combination of software engineering and data management skills, and requires knowledge of data storage technologies, data processing frameworks, and data security and privacy. Data engineers use a variety of tools and technologies, including Hadoop, Spark, and cloud data platforms, to build and maintain data infrastructure.

Data engineering is a critical component of data-driven organizations, as it provides the foundation for data-driven decision making and analytics. By designing and building robust data infrastructure, data engineers enable organizations to leverage data to make informed decisions and drive business value.

What are the top data engineering best practices?

Here are some best practices for data engineering:


Proper Data Modeling and Design

What is data modeling and design?

Data modeling and design is the process of creating a blueprint or a conceptual representation of data, relationships, and constraints in a system. It involves defining entities, attributes, and relationships between them to create a logical model of the data. The purpose of data modeling is to provide a common understanding of the data among stakeholders, to ensure data consistency, and to facilitate the design of a physical database. Good data modeling practices help to improve data quality, maintainability, scalability, and performance of the data systems.
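
To make this concrete, here is a minimal sketch (not from the original article) of a logical model expressed as SQL DDL through Python’s built-in sqlite3 module; the entities, attributes, and constraints shown are purely illustrative:

```python
import sqlite3

# A tiny logical model: two entities (customers, orders) and one
# relationship (each order belongs to exactly one customer).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT NOT NULL UNIQUE      -- attribute-level constraint
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        amount      REAL NOT NULL CHECK (amount >= 0),
        FOREIGN KEY (customer_id) REFERENCES customers (customer_id)
    );
""")
```

Encoding the relationships and rules as constraints means the database itself rejects inconsistent writes, which directly supports the quality and consistency goals described above.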

Data Quality Management

What is data quality management?

Data quality management is the practice of ensuring that data is accurate, complete, consistent, and trustworthy. This involves a set of processes and techniques used to assess, monitor, and improve the quality of data, including data validation, data cleansing, data enrichment, data standardization, and data reconciliation. Data quality management is important because poor data quality can lead to incorrect business decisions, loss of revenue and reputation, and decreased operational efficiency. Effective data quality management requires a systematic approach involving the participation of all stakeholders and a focus on continuous improvement.
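
As a rough illustration of what automated validation can look like, the sketch below (all field names are hypothetical) checks a single record for completeness, validity, and basic consistency:

```python
def validate_record(record: dict) -> list:
    """Return a list of quality issues found in one record (empty = clean)."""
    issues = []

    # Completeness: required fields must be present and non-empty.
    for field in ("customer_id", "email", "amount"):
        if record.get(field) in (None, ""):
            issues.append(f"missing required field: {field}")

    # Validity: amount must be a non-negative number.
    amount = record.get("amount")
    if amount is not None and (not isinstance(amount, (int, float)) or amount < 0):
        issues.append(f"invalid amount: {amount!r}")

    # Consistency: email must at least look like an address.
    email = record.get("email") or ""
    if email and "@" not in email:
        issues.append(f"malformed email: {email!r}")

    return issues


print(validate_record({"customer_id": 1, "email": "a@example.com", "amount": -5}))
# -> ['invalid amount: -5']
```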

Data Pipeline Automation

What is data pipeline automation?

Data pipeline automation is the process of automating the flow of data from one or more sources to one or more destinations. This involves the use of software tools and scripts to automate the extraction, transformation, and loading (ETL) of data from various sources, such as databases, APIs, or file systems, into a data storage solution, such as a data warehouse or a data lake. The goal of data pipeline automation is to reduce manual intervention and human error, to improve data quality and reliability, and to facilitate the timely delivery of data to stakeholders. Data pipeline automation can also help to increase the efficiency and scalability of data processing and to reduce the time and effort required to maintain and update data pipelines.
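
A minimal sketch of an automated extract-transform-load flow, using only the Python standard library (the CSV layout and table names are invented for illustration):

```python
import csv
import io
import sqlite3

def extract(source):
    """Extract: read raw rows from a CSV source."""
    return list(csv.DictReader(source))

def transform(rows):
    """Transform: normalize types/casing and drop rows that fail basic checks."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["id"]), row["email"].strip().lower()))
        except (KeyError, ValueError):
            # A real pipeline would route bad rows to a dead-letter store.
            continue
    return clean

def load(rows, conn):
    """Load: write transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER, email TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)", rows)

raw = io.StringIO("id,email\n1, A@Example.COM\nbad,row\n")
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT * FROM users").fetchall())  # [(1, 'a@example.com')]
```

In practice a scheduler or orchestrator (cron, Airflow, and the like) would trigger such a pipeline on a cadence, which is where the automation comes in.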

Scalable Infrastructure

What is scalable infrastructure?

Scalable infrastructure refers to a system architecture that can accommodate growth and change in capacity and performance requirements. This involves the design and implementation of systems that can handle increasing amounts of data, users, and processing demands, without sacrificing performance, reliability, or stability. Scalable infrastructure is important for data-driven organizations, as it allows them to respond to changing business needs, handle unexpected spikes in demand, and support future growth. A scalable infrastructure typically includes a combination of hardware, software, and network components that can be added, removed, or reconfigured as needed to meet changing demands. Examples of scalable infrastructure include cloud computing, distributed systems, and modular architectures.
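
Scalability concerns show up even at the code level. As an illustrative sketch (not a full infrastructure design), the example below partitions work across a process pool so capacity grows by adding workers rather than by enlarging a single machine:

```python
from concurrent.futures import ProcessPoolExecutor

def process_partition(partition):
    # Stand-in for real per-partition work (parsing, aggregation, scoring...).
    return sum(x * x for x in partition)

def run(data, workers):
    # Split the data into independent partitions; capacity then grows by
    # adding workers (horizontal scaling) instead of a bigger machine.
    chunks = [data[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(process_partition, chunks))

if __name__ == "__main__":
    print(run(list(range(100_000)), workers=4))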

Security and Privacy

What is data security and privacy?

Data security refers to the protection of data from unauthorized access, theft, damage, or destruction. It involves the use of technical and organizational measures to secure data and prevent data breaches, hacking, and other security threats. Data security is important to ensure the confidentiality, integrity, and availability of data and to protect sensitive information and personal data.

Data privacy refers to the protection of personal data and the rights of individuals to control how their data is collected, used, and shared. This involves complying with privacy laws and regulations, such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), and implementing appropriate privacy policies and controls. Data privacy is important to protect the privacy rights of individuals, to maintain consumer trust, and reduce the risk of privacy breaches and data misuse.

Both data security and privacy are essential components of data management and must be integrated into data management practices and systems to ensure the protection of data and the rights of individuals.
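
As a small, hedged example of privacy-preserving techniques in code (the key handling is deliberately simplified; a real system would pull the key from a secrets manager):

```python
import hashlib
import hmac

# Hypothetical key for illustration only; never hardcode keys in production.
SECRET_KEY = b"rotate-me"

def pseudonymize(value):
    """Replace a direct identifier with a keyed, irreversible token.
    The same input always yields the same token, so joins still work."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def mask_email(email):
    """Masking for display: keep just enough to be recognizable."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"

print(pseudonymize("alice@example.com"))  # stable token usable in analytics
print(mask_email("alice@example.com"))    # a***@example.com
```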

Documentation and Version Control

What is document version control?

Document version control is a process for managing changes to documents and other information so that you can track and control the evolution of the information over time. This allows multiple people to collaborate on a document and to view, compare, and revert to previous versions of the document if necessary.

In a version control system, each change to a document is recorded, along with a description of the change and the identity of the person who made it. This creates a history of all changes, making it easier to understand how the document has evolved, and to revert to previous versions if necessary.

Version control is commonly used for software development but can also be applied to other types of information, such as design documents, configuration files, and databases. The benefits of version control include improved collaboration, increased efficiency, and better management of the information and its history.

Monitoring and Logging

What is data monitoring and logging?

Data monitoring and logging are processes used to monitor the performance and behavior of data systems and to collect and store information about the data and its usage.

Data monitoring involves tracking the performance of data systems in real-time, such as data processing times, disk usage, network performance, and system resource utilization. This allows you to detect and diagnose performance issues and identify trends and patterns in data usage.

Data logging, on the other hand, involves collecting and storing information about data and system events, such as data changes, error messages, and system alerts. This information can be used to diagnose issues, to track data usage patterns, and provide an auditable trail of data and system events.

Data monitoring and logging are important for ensuring the reliability, performance, and security of data systems. By collecting and analyzing data about system performance and behavior, you can detect and resolve issues quickly and ensure that data is being used and processed correctly.
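
A minimal sketch of step-level logging with Python’s standard logging module (the step name and toy workload are illustrative):

```python
import logging
import time

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
log = logging.getLogger("pipeline.orders")  # hypothetical pipeline name

def run_step(name, func):
    """Run one pipeline step, logging its duration and outcome."""
    start = time.monotonic()
    try:
        result = func()
        log.info("%s succeeded in %.2fs", name, time.monotonic() - start)
        return result
    except Exception:
        # .exception() records the stack trace for later diagnosis.
        log.exception("%s failed after %.2fs", name, time.monotonic() - start)
        raise

run_step("load_orders", lambda: sum(range(1_000)))
```

Durations and outcomes logged this way can feed the real-time monitoring dashboards and audit trails described above.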

Error Handling and Recovery

What is error handling and recovery?

Error handling and recovery refer to the process of detecting, addressing, and recovering from errors and failures that occur in data systems.

Error handling involves the detection of errors and failures in data systems and implementing mechanisms to handle these errors in a controlled and predictable manner. This involves the design and implementation of error-handling routines, such as exception handling, and the use of error codes and messages to communicate the nature of the error.

Data recovery refers to restoring data systems to a functional state after a failure or error has occurred. This involves using backup and recovery strategies, such as disaster recovery plans and data backups, to ensure that data can be restored in the event of a failure or disaster.

Both error handling and recovery are critical components of data management, as they help to ensure the reliability, availability, and recoverability of data systems and to minimize the impact of errors and failures on business operations. By implementing robust error handling and recovery strategies, you can ensure that data systems continue to function, even in the event of an error or failure.
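
The sketch below illustrates one common error-handling pattern, retry with exponential backoff, using invented function names; real pipelines would pair this with alerting and backup-based recovery:

```python
import random
import time

def with_retries(func, attempts=5, base_delay=0.5):
    """Retry a flaky operation with exponential backoff and jitter."""
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except ConnectionError:          # retry only transient error types
            if attempt == attempts:
                raise                    # exhausted: surface for recovery/alerting
            delay = base_delay * 2 ** (attempt - 1) + random.uniform(0, 0.1)
            time.sleep(delay)

def flaky_fetch():
    # Stand-in for a network call that sometimes fails transiently.
    if random.random() < 0.5:
        raise ConnectionError("temporary outage")
    return {"rows": 42}

print(with_retries(flaky_fetch))
```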

Team Collaboration and Communication

What is team collaboration and communication?

Team collaboration and communication are processes that facilitate effective and efficient teamwork and communication between team members.

Team collaboration involves the use of tools, processes, and methodologies to support teamwork and cooperation between team members. This includes the use of collaborative tools, such as project management software, and the implementation of teamwork best practices, such as agile methodologies.

Communication is the exchange of information and ideas between team members, and is critical to the success of team collaboration. Effective communication involves the use of clear and concise language, active listening, and the use of appropriate communication tools and methods.

Both team collaboration and communication are important for ensuring the success of data projects and initiatives, as they facilitate coordination and cooperation between team members, and ensure that everyone is on the same page. By fostering strong collaboration and communication practices, you can improve team performance, reduce misunderstandings and errors, and increase the efficiency and effectiveness of data projects.

Continuous Integration and Delivery

What is continuous integration and delivery?

Continuous integration (CI) and continuous delivery (CD) are software development practices that aim to automate the process of building, testing, and deploying software.

CI is the practice of regularly integrating code changes into a shared repository, and automating the build and testing process. This allows developers to detect and resolve issues early in the development cycle, and to ensure that code changes are consistent with the overall codebase.

CD, on the other hand, is the practice of automating the process of delivering code changes to production, by automatically building, testing, and deploying code changes to production systems. This enables faster and more reliable software delivery, and reduces the risk of errors and failures in production.

CI and CD are key components of DevOps, and are used to streamline and optimize the software development and delivery process. By automating the build, testing, and deployment process, you can reduce the time and effort required to deliver software changes, and improve the reliability and quality of software releases.
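
CI is only as valuable as the automated checks it runs on each commit. As a hypothetical example (the function under test is invented), here is the kind of small unit test a CI job might execute automatically:

```python
# test_transform.py -- the kind of check a CI job runs on every commit.

def normalize_email(email):
    """Production function under test (simplified, invented for illustration)."""
    return email.strip().lower()

def test_strips_and_lowercases():
    assert normalize_email("  A@Example.COM ") == "a@example.com"

def test_is_idempotent():
    once = normalize_email("A@B.CO")
    assert normalize_email(once) == once

if __name__ == "__main__":
    test_strips_and_lowercases()
    test_is_idempotent()
    print("all checks passed")
```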

These practices help to ensure the reliability, efficiency, and scalability of your data pipelines and systems.


This article on top data engineering best practices was AI-generated by ChatGPT and edited by Solutions Review editors.

Solutions Review Releases 2023 Buyer's Guide for Data Integration Tools

Solutions Review is announcing the release of its new 2023 Buyer’s Guide for Data Integration Tools in an attempt to help organizations select the best software.

Solutions Review today is releasing its newly updated 2023 Buyer’s Guide for Data Integration Tools for data, application, and cloud integration to assist organizations during the research and discovery phase of buying business software. Solutions Review editors compile each Buyer’s Guide via a meta-analysis of available online materials, research, analyst reports, conversations with subject matter experts and vendor representatives, and product demonstrations.

Top providers highlighted include: 1Spatial, Adeptia, Boomi, Celigo, Cleo, Denodo, Equalum, Fivetran, Hitachi Vantara, IBM, Informatica, Jitterbit, Keboola, Matillion, Microsoft, MuleSoft, Oracle, Precisely, Qlik, Safe Software, SAP, SAS, SnapLogic, Striim, Talend, TIBCO, Tray.io, and Trifacta.

Solutions Review is a collection of business software news and resource sites that aggregates, curates, and creates the leading content to connect buyers and sellers. Over the past six years, Solutions Review has launched 20 distinct and category-specific Buyer’s Guide sites for technologies ranging from Cybersecurity to Big Data and WorkTech, as well as Identity and Access Management, Endpoint Security, Data Analytics and Data Management, Enterprise Resource Planning and Business Process Management, and Enterprise Cloud and Network Monitoring.

In addition to the 2023 Buyer’s Guide for Data Integration Tools, readers are encouraged to view updated research in other related Solutions Review coverage areas.

Solutions Review also drastically increased its presence in the virtual events space in 2022. Headlined by the Solution Spotlight series and our popular Demo Days, Solutions Review is emerging as a major multimedia influencer in enterprise technology.

Nearly 10 million technology professionals will use Solutions Review to help them evaluate business software this year. Solutions Review is a vendor-agnostic resource that does not endorse any individual product or service.

Download Solutions Review’s 2023 Buyer’s Guide for Data Integration Tools.

What’s Changed: 2021 Gartner Magic Quadrant for Data Integration Tools

The editors at Solutions Review highlight what’s changed since the last iteration of Gartner’s Magic Quadrant for Data Integration Tools and provide an analysis of the new report.

Analyst house Gartner, Inc. has released its 2021 Magic Quadrant for Data Integration Tools. According to the researcher, the marketplace for data integration tools is seeing “renewed momentum” driven largely by requirements for hybrid and multicloud data integration, augmented data management, and data fabric designs. Gartner defines data integration tools as “the discipline comprising the architectural patterns, tools and methodologies” that enable data access, transformation, and processing.

This space includes vendors that offer standalone software products to enable data access and delivery infrastructure for a number of scenarios like data engineering, support for the delivery of analytics, data lake enablement, self-service data preparation, cloud migration, sourcing and delivery of master data, interenterprise data sharing, data consistency, and data fabric. As a strategic planning assumption, the analyst house predicts that manual data management tasks will be reduced by 45 percent via machine learning and automation.

Data integration tools are a stable growth market, with leading vendors continuing to lose ground to niche players and up-and-comers. While the top-five players in the space had a 71 percent market share in 2017, that number dropped to 56 percent in 2020. In large part, this is due to mega-vendors like Informatica and Talend homing in more on their iPaaS solutions. The smaller providers gaining ground are doing so via a focus on leadership in a specific data integration style.


Other key drivers in the data integration tools market include an end-user push toward augmented data integration, an increase in data engineering requirements, the emerging discipline of DataOps, a notable increase in hybrid and intercloud data management, and the convergence of application and data integration. Gartner concludes: “More data and analytics leaders are realizing that data integration is a critical component of their data management infrastructure.”

In this Magic Quadrant, Gartner evaluates the strengths and weaknesses of 18 providers that it considers most significant in the marketplace and provides readers with a graph (the Magic Quadrant) plotting the vendors based on their Ability to Execute and their Completeness of Vision. The graph is divided into four quadrants: niche players, challengers, visionaries, and leaders. At Solutions Review, we read the report, available here, and pulled out the key takeaways.

Gartner adjusts its evaluation and inclusion criteria for Magic Quadrants as software markets evolve. As a result, CloverDX was added while Actian, Information Builders, and Hitachi Vantara were removed for no longer meeting Gartner’s inclusion criteria. Representative vendors include Adeptia, CloverDX, Denodo, Fivetran, HVR, IBM, Informatica, Matillion, Microsoft, Oracle, Precisely, Qlik, Safe Software, SAP, SAS, SnapLogic, Talend, and TIBCO Software.

Informatica once again leads the pack as the top-ranked vendor in this report. With more than 10,000 data integration tools customers, product investments aligned to the data fabric vision, and excellent support for data engineering use cases, it’s not a surprise. IBM holds the next most valuable position with a modular architecture that supports intercloud and hybrid integration and strong support for DataOps. IBM also offers capabilities that support the data fabric design.

For the second-straight iteration, SAP, Oracle, and Talend form a bloc in the middle of the Leaders quadrant. SAP touts a product strategy that focuses on unification and hybrid. The company also caters to the data processing needs of enterprise users. Oracle is aligned to the data fabric use case and offers balanced support for a variety of data delivery styles. Talend brings a governed approach to operationalizing self-service data preparation and strong features for those in need of a tool for data engineering.

Microsoft and Denodo both saw their positions improve slightly along the graph’s middle axis, with the former returning from a stay among the Challengers. Microsoft touts an overall low total cost of ownership, speed of implementation, and ease of use. The mega-vendor also enjoys the advantage of broad enterprise familiarity with its products. Denodo’s product strategy focuses on logical and distributed architectures and the company invests 25 percent of its total revenue in research and development.

Qlik headlines the Challengers bracket for 2021 with a strong core of functionality, multipersona capabilities, and ease of use, particularly around installation, configuration, and code generation. Even with a bit of placement regression, SAS retains a strong position with enhanced support for different user personas and for enabling users to perform complex transformations. Reference customers also enjoy SAS’s professional services, ranking them highly for services and support.

TIBCO Software recently made a major splash in acquiring competitor Information Builders. The company also offers modular tools that support what Gartner refers to as a “connect and collect” vision for data integration. TIBCO touts an improved bulk/batch integration capability as well. Precisely rounds out the Challengers column with support for cloud data migration, strong potential for data management platform connectivity, and reference customer ease of use.

SnapLogic represents the lone Visionaries provider for 2021 featuring augmented guidance and automation and an easy-to-understand pricing model. SnapLogic recently dropped its August 2021 product release with SnapLogic Flows and new API lifecycle management capabilities. SnapLogic Flows offers an intuitive user experience for non-technical business users and enables the development of application integrations and data flow automation without code.

Most of the vendors included in the Niche Players column showed significant improvement from their 2020 positions. Among companies in this bracket, Safe Software holds the top position on Gartner’s horizontal axis. The provider offers spatial data integration features and no- and low-code data preparation. Safe is also available via a number of deployment options. Adeptia’s unified platform supports the core requirements of connectivity and adapters, bulk/batch data delivery, and data capture and propagation. Reference customers also enjoy Adeptia’s pricing and value.

Fivetran offers a cloud data engineering-focused product strategy and low cost of ownership for low data latency. Fivetran also touts end-to-end automation and vendor-managed data with pre-built schemas. Matillion is highly focused on cloud data engineering and offers deployment options to solve common data sovereignty issues. Reference customers are highly satisfied with Matillion as well. HVR’s focus is mainly on data replication, though it also offers support for complex data transformations. HVR automates data replication and synchronization tasks as well.

New entrant CloverDX offers a product that homes in on solving intricate data problems and touts a customer focus reflected in services, pricing, packaging, and deployment. Other key capabilities of note include data discovery, classification, and anonymization support.

Read Gartner’s Magic Quadrant for Data Integration Tools.


The Biggest Data Integration News Items During the First Half of 2021

Our editors curated this list of the biggest data integration news items during the first half of 2021, as highlighted on Solutions Review.

Data integration is a combination of technical and business processes used to combine different data from disparate sources. This process generally supports the analytic processing of data by aligning, combining, and presenting each data store to an end-user. Organizations commonly use data integration software for enterprise-wide data delivery, data quality, governance, and analytics.

Part of Solutions Review’s ongoing analysis of the big data marketplace includes covering the biggest data integration news stories which have the greatest impact on enterprise technologists. This is a curated list of the most important data integration news stories from the first half of 2021. For more on the space, including the newest product releases, funding rounds, and mergers and acquisitions, follow our popular news section.


The Biggest Data Integration News Items During the First Half of 2021

Workato to Extend Product and Technology with $110M New Funding (January)

Workato announced that it has secured a Series D funding round worth $110 million, according to a press release on the company’s website. The capital injection brings Workato’s total raised to more than $221 million since its founding in 2013. The round was led by Altimeter Capital and Insight Partners. Workato will use the new investment to expand product innovation and technology development. It will also broaden its customer success program and host its first user conference in 2021. Workato previously raised $70 million in Series C funding in November 2019.

In a media statement about the news, Workato co-founder and CEO Vijay Tella said “There’s been explosive growth in business apps and cloud technologies, but their potential remains largely untapped. This explosion has created tech chaos with siloed data, fragmented business processes and broken UX. Workato addresses this with a single platform built for business and IT that easily, reliably, and securely connects their apps, data, and business processes so teams can work smarter and faster. With our new investment, we’re looking forward to helping other companies around the world use integration-led automation to transform how they work.”

Fivetran Releases New Data Connectors for IBM, Oracle and SAP (January)

Fivetran announced a new suite of data connectors for IBM, Oracle, and SAP in a press release on the company’s website. The pre-built connectors add to the more than 150 already maintained by Fivetran. Specifically, Fivetran now manages data integration for enterprise databases like IBM Db2 on Linux, Unix, and Windows, as well as SAP HANA. Fivetran data connectors feature end-to-end automated replication capabilities such as managed maintenance, automatic schema management, failure recovery, and incremental updates.

In a media statement about the news, Fivetran Vice President of Product Fraser Harris said “Enterprise data teams just don’t have the time to build, manage and maintain the enormous number of data integrations that are being requested. We see it as our mission at Fivetran to eliminate this burden, freeing up invaluable time by providing an extensive library of easy-to-use, fully managed and maintained data connectors. Because we observe tens of thousands of customer connector configurations and manage more than one million syncs daily, we’re able to apply our insight and continuous pipeline optimizations across all of our 1600-plus customers, providing added value for all.”

Matillion Raises $100 Million in Series D Funding After Banner 2020 (February)

Matillion has announced $100 million in Series D funding according to a press release on the company’s website. The round, which was led by Lightspeed Venture Partners with participation from existing investors, brings Matillion’s total capital raised to more than $160 million since its founding in 2011. The funding will enable Matillion to meet the demand for its cloud data integration products. The news comes on the heels of the company’s release of Matillion Exchange ETL marketplace from July.

In a media statement on the news, Matillion CEO Matthew Scullion said “Simply put, Matillion makes data useful. Matillion accelerates the value of today’s cloud data platforms, ensuring that data teams can work smarter and faster. We look forward to our next phase of growth, as we empower more enterprise customers with efficient, affordable and flexible solutions that turn data into their most strategic asset.”

MuleSoft Composer is the First Salesforce UI-Embedded Integration Tool

MuleSoft announced the release of MuleSoft Composer, a new solution designed to connect apps and data to Salesforce for faster connected experiences. MuleSoft Composer is the first generally available Composer solution according to the vendor, and is built atop the company’s popular Anypoint Platform. It’s also the first integration solution embedded in the Salesforce UI. Composer does not require users to write code. Salesforce acquired MuleSoft in March 2018.

In a media statement on the news, MuleSoft CPO Meir Amiel said “Across every business function and industry, our customers are reimagining their customer, employee, partner and product experiences. MuleSoft Composer will continue to expand our vision to help companies realize speed, agility and efficiency by empowering anyone – from IT to now business – to unlock apps and data with clicks, not code, and without losing security and governance.”

Talend Acquired by Thoma Bravo in $2.4 Billion Deal (March)

Talend has agreed to be acquired by private equity firm Thoma Bravo for $2.4 billion, according to a press release on the company’s website. Upon completion of the transaction, Talend will become a private company. The deal will enable Talend to continue investing in its cloud transition and enhance its data integration offerings for customers. The news comes on the heels of Talend’s September unveiling of Talend Trust Score, a tool that instantly assesses the reliability of an organization’s data.

In a media statement about the news, Talend CEO Christal Bemont said “I am pleased with how the team has executed through the challenges of 2020 and successfully delivered on the strategic initiatives and targets we laid out a year ago. As we look forward, the transaction with Thoma Bravo – a firm with a successful track record of helping the companies it acquires achieve long-term, sustainable growth – will provide Talend with additional capital, resources, and expertise to execute against our mission and leverage a large market opportunity to help all organizations become data-driven.”

Denodo Unveils New Cloud Data Integration Tool Denodo Standard (March)

Denodo recently announced the release of Denodo Standard, a new data integration solution available on leading cloud marketplaces. Denodo Standard utilizes the company’s data virtualization engine to enable real-time analytics and data services without replicating the data into another repository. The product can be purchased directly from a customer’s preferred cloud marketplace, whether AWS, Microsoft Azure, or Google Cloud, and also touts by-the-hour pricing.
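
For readers unfamiliar with data virtualization, the toy sketch below illustrates the core idea: a “virtual view” joins live sources at query time instead of copying their rows into a separate repository beforehand. Every name in it is invented for illustration and says nothing about how Denodo’s engine actually works.

```python
# Toy illustration of data virtualization: the virtual view pulls from
# each live source when queried; nothing is replicated ahead of time.
# All source and field names are hypothetical.
def crm_source():
    yield {"customer_id": 1, "name": "Acme"}

def billing_source():
    yield {"customer_id": 1, "balance": 250.0}

def virtual_customer_view():
    """Join two live sources lazily at query time."""
    balances = {r["customer_id"]: r["balance"] for r in billing_source()}
    for row in crm_source():
        yield {**row, "balance": balances.get(row["customer_id"])}

print(list(virtual_customer_view()))
# [{'customer_id': 1, 'name': 'Acme', 'balance': 250.0}]
```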

In a media statement about the news, Denodo Senior Vice President Ravi Shankar said “The goal of Denodo Standard is to enable organizations of all sizes to unlock the value of their data assets faster, and with a lower cost and resource commitment. By leveraging cloud infrastructure automation, usage-based pricing, and free trials in the cloud marketplaces, we have lowered barriers to start using Denodo for data integration.”

Trifacta Launches Industry-First Data Engineering Cloud (April)

Trifacta recently announced the release of Trifacta Data Engineering Cloud, a major expansion of its visual data preparation platform. The product is headlined by multi-cloud support, with solutions optimized for each platform. Users can choose between ETL and ELT, or an optimal combination of the two based on cost. Trifacta Data Engineering Cloud also touts universal connectivity to more than 180 enterprise data sources, both on-prem and in the cloud, and integrates with existing toolchains.

In a media statement about the news, Trifacta CEO Adam Wilson said “Accelerating data preparation and democratizing ETL for these users and their cloud data warehousing projects requires an enterprise-grade data engineering platform that is open, intelligent, and self-service. We’ve created the Trifacta Data Engineering Cloud to meet these needs.”

Oracle Unveils Oracle Cloud Infrastructure GoldenGate Service Offering (April)

Oracle recently announced the release of Oracle Cloud Infrastructure GoldenGate service, which makes its GoldenGate data integration technology available via a fully managed cloud service. The product provides customers with a real-time data fabric platform to design, execute, orchestrate, and monitor their data replication and streaming data events. Oracle Cloud Infrastructure GoldenGate service also touts an elastic, pay-per-use framework for general-purpose database replication, data integration, and real-time ingestion to the cloud.

In a media statement on the news, Oracle EVP of Mission-Critical Database Technologies Juan Loaiza said “Oracle GoldenGate is a highly trusted data integration solution for thousands of customers worldwide, including 84 percent of Fortune Global 100 companies. Oracle is now delivering the enterprise-hardened GoldenGate technology as a cloud-native elastic service to provide customers a simple-to-use, integrated, operational and analytic real-time data fabric. It cuts both the time and cost of creating a data integration solution in half compared to the options provided on competing clouds.”

Jitterbit Unveils New Integration Marketplace and Resource Hub (May)

Jitterbit recently announced Jitterbit Marketplace, a new solution and expertise marketplace where vendors can provide customers with Jitterbit-certified resources. The marketplace includes Application Connectors, Integration Recipes, Process Templates, and related delivery services for implementing API integration solutions. At launch, Jitterbit Marketplace touts more than 200 solution assets and over 20 Jitterbit-certified service providers to choose from.

In a media statement on the news, Jitterbit Senior Vice President of Partner Development, Channels and Alliances Ron Wastal said “API integrations are historically done by vendors or solution consultants that code them as one-offs, but the new Jitterbit Marketplace changes that. We want our ecosystem to build on top of our platform and create repeatable integrations that benefit end-users.”

MuleSoft Unveils Latest Version of Anypoint with DataGraph Platform (May)

MuleSoft recently announced the release of the latest version of its flagship integration and API platform, MuleSoft Anypoint with DataGraph. The latest version, which the company is referring to as a “major release,” enables developers to quickly access and serve data from multiple existing APIs with a single query, without having to write any code. MuleSoft’s Anypoint DataGraph will be generally available this month.
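
Querying several APIs at once is the kind of problem GraphQL was designed for, and the hypothetical sketch below shows the general shape of such a single-query request. The endpoint, schema, fields, and token are all invented for illustration; they are not MuleSoft’s published API.

```python
# Illustrative only: one GraphQL-style query stitching data that would
# otherwise require calls to several REST APIs. Endpoint, schema, and
# token are hypothetical -- not MuleSoft's published API.
import json
import urllib.request

QUERY = """
{
  customer(id: "42") {
    name
    orders { id total }            # could be served by one underlying API
    shipments { trackingNumber }   # ...and this by another
  }
}
"""

def run_query(endpoint, token, query):
    """POST a GraphQL query and return the decoded JSON response."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps({"query": query}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# result = run_query("https://example.invalid/graphql", "TOKEN", QUERY)
```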

In a media statement on the news, MuleSoft Chief Product Officer Meir Amiel said “Every industry is under pressure to deliver digital innovation faster to meet rising customer expectations and adapt to changing market dynamics. Any digital project requires integration, data, security, governance and APIs. With MuleSoft, companies are rethinking their approach to integration and APIs based on secure, reusable building blocks to solve their most immediate need and lay the foundation for ongoing innovation.”

Clearlake-Backed Precisely Acquires Data Governance Firm Infogix (May)

Clearlake Capital-backed Precisely recently announced that it signed a definitive agreement to acquire data governance software provider Infogix. Terms of the deal were not made available, but the merger is a major move coming on the heels of Clearlake’s acquisition of Precisely in March. Syncsort re-branded to Precisely following the company’s acquisition of the Pitney Bowes software and data business, announced in August 2019. The transaction is expected to close by the end of Q2 2021.

In a media statement on the news, Precisely CEO Josh Rogers said: “As companies seek to position and differentiate themselves for the digital era, they quickly realize that success requires trusted data. Precisely provides that foundation with maximum accuracy, consistency and context so businesses can make confident data-driven decisions. Infogix is a perfect fit for Precisely, with highly complementary data quality products and services, and a leading data governance solution that our customers have been seeking.”
