Summer Release 2017: Focus on operational efficiency and artificial intelligence

By Dinesh Chandrasekhar

SnapLogic Enterprise Integration Cloud’s Summer 2017 Release (v4.10) is generally available to customers today. We are excited to share many new features and enhancements that revolve around three themes: operational efficiency for digital transformation, continued innovation in the use of Iris artificial intelligence, and added support for cloud data warehousing and big data. While we stay focused on offering one of the finest and easiest user experiences in the iPaaS space, we wanted to extend that experience to the operational side as well.

A few key highlights from this release that focus on improving operational efficiencies and the use of Artificial Intelligence:

  • AI-Powered Pipeline Optimization: SnapLogic’s Iris technology now evaluates pipeline configuration and operation to recommend specific performance improvements based on machine learning, delivered to users via dashboard alerts and notifications.
  • Local Snaplex Dashboard: Customers use Ultra Pipelines in mission-critical use cases that demand uninterrupted data flow across multiple data sources. We have now introduced a local dashboard that provides full visibility into Ultra Pipelines, giving customers increased confidence that their high-priority projects are running without interruption.
  • API Threshold Notifications: In a similar vein on dashboards, we are adding monitoring capabilities focused on operations and project development. Real-time alerts and notifications give customers an early warning system for monitoring complex integration projects. Alerts can be set up to notify stakeholders when API limits are approached or exceeded, when users are added or removed from the platform, and when projects are modified.
  • Project-level Migration: We’ve been listening to your requests! With the fast pace at which integration projects are executed and delivered across the enterprise, it is imperative that project lifecycles do not become a bottleneck. We’ve introduced a new feature that automates the migration of projects across departments and organizations, eliminating hand-coding and hard-coding for seamless project migrations.
  • Public API for Groundplex Installation: A new API is now available to automate the installation and configuration of on-premises SnapLogic nodes, eliminating the time and effort of manually downloading and installing additional nodes.
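To give a flavor of what such automation could look like, here is a purely hypothetical Python sketch. The endpoint path, payload fields, and credentials below are placeholders, not SnapLogic’s documented API; consult the product documentation for the real interface.

    # Hypothetical sketch only: URL, payload, and auth are placeholders, not
    # SnapLogic's documented API. See the product docs for the real endpoint.
    import requests

    resp = requests.post(
        "https://example.snaplogic.com/api/1/rest/plex/install",  # placeholder path
        auth=("ops@example.com", "s3cret"),                       # placeholder credentials
        json={"snaplex": "ground-prod", "nodes": 2},              # placeholder payload
    )
    resp.raise_for_status()
    print(resp.json())  # e.g., node bootstrap details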

A few key highlights from this release that focus on enhancing our already comprehensive integration capabilities into cloud data warehousing and big data:

  • Microsoft Azure SQL Snap Pack: A brand new Snap Pack that allows customers to quickly and easily migrate on-premises databases to Microsoft Azure SQL Database for fast cloud-based reporting and analytics.
  • Teradata Select Snap: Expands the Teradata Snap Pack, allowing users to retrieve data from a Teradata database and display it in a table for easy reporting tasks.
  • Parquet Writer Snap: Allows users to store data in specific partitions in Hadoop HDFS, improving the ability to create reports and derive meaningful insights from big data.
  • Parquet Reader/Writer Snap: Allows users to read and write Parquet files on Amazon S3, with AWS Identity & Access Management (IAM) support, in addition to HDFS, expanding SnapLogic’s support for cloud-based data warehousing via Redshift.
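To illustrate the underlying pattern outside of SnapLogic, here is a minimal sketch using the open source PyArrow library; the bucket, path, and columns are invented, and credentials are assumed to come from the AWS environment.

    # Illustrative only: writing Parquet to Amazon S3 with PyArrow, analogous
    # to what the Parquet Writer Snap does inside a pipeline.
    import pyarrow as pa
    import pyarrow.parquet as pq
    from pyarrow import fs

    table = pa.table({"order_id": [1001, 1002, 1003], "amount": [250.0, 75.5, 12.0]})
    s3 = fs.S3FileSystem(region="us-east-1")  # IAM credentials resolved from the environment
    pq.write_table(table, "my-bucket/warehouse/orders.parquet", filesystem=s3)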

Watch the Summer 2017 Release webinar to learn about all the features and enhancements across our Snaps, IoT, CRM, and more.

Dinesh Chandrasekhar is Director of Product Marketing at SnapLogic. Follow him on Twitter @AppInt4All.

The bigger picture: Strategizing your data warehouse migration

By Ravi Dharnikota

If your organization is moving its data warehouse to the cloud, you can be confident you’re in good company. And if you read my last blog post about the six-step migration process, you can be even more confident that the move will go smoothly. However, don’t pull the trigger just yet. You’ve got a bit more planning to do, this time at a more strategic level.

First, let’s recap the migration process for the data warehouse itself, which I covered in my last post. In that post, I broke down all the components of this diagram:

Data Warehouse Migration Process

Now, as you can see in the diagram below, the data warehouse migration process is itself part of a bigger picture of migration planning and strategy. Let’s take a look at the important pre-migration steps you can take to help ensure success with the migration itself.

Migration Strategy and Planning

Step 1: Define Goals and Business Case. Start the planning process with a clear picture of the business reasons for migrating your data warehouse to the cloud. Common goals include:

  • Agility, both for the business and for the IT organization’s data warehousing projects.
  • Performance on the back end, to ensure timeliness and availability of data, and on the front end, for fast end-user query response times.
  • Growth and headroom to ease capacity planning; the elastic scalability of cloud resources eases this burden.
  • Cost savings on hardware, software, services, space, and utilities.
  • Labor savings from reduced needs for database administration, systems administration, scheduling and operations, and maintenance and support.

Step 2: Assess the current data warehouse architecture. If the current architecture is sound, you can plan to migrate to the cloud without redesign or restructuring. If it is architecturally sufficient for BI but limited for advanced analytics and big data integration, review and refine data models and processes as part of the migration effort. If the current architecture struggles to meet even today’s BI requirements, plan to redesign it as you migrate to the cloud.

Step 3: Define the migration strategy. A “lift and shift” approach is tempting, but it rarely succeeds. Changes are typically needed to adapt data structures, improve processing, and ensure compatibility with the chosen cloud platform. Incremental migration is more common and usually more successful.

As I mentioned in my last blog post, a hybrid strategy is another viable option. Here, your on-premises data warehouse can remain in operation as the cloud data warehouse comes online. During this transition phase, you’ll need to synchronize the data between the old on-premises data warehouse and the new one that’s in the cloud.
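As a minimal sketch of that synchronization, assuming the legacy tables carry a watermark column such as updated_at (the schema is invented, and sqlite3 stands in for your actual warehouse drivers):

    # Watermark-based sync: copy rows changed in the legacy warehouse since the
    # last sync into the cloud warehouse. Table and column names are invented.
    import sqlite3  # stand-in for real warehouse drivers

    def sync_changed_rows(legacy, cloud, last_sync_ts):
        rows = legacy.execute(
            "SELECT id, amount, updated_at FROM sales_fact WHERE updated_at > ?",
            (last_sync_ts,),
        ).fetchall()
        cloud.executemany(
            "INSERT OR REPLACE INTO sales_fact (id, amount, updated_at) VALUES (?, ?, ?)",
            rows,
        )
        cloud.commit()
        # Advance the watermark to the newest row copied.
        return max((r[2] for r in rows), default=last_sync_ts)

Run on a schedule, this keeps the cloud copy trailing the legacy warehouse by at most one sync interval.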

Step 4: Select the technology, including the cloud platform you’ll migrate to and the tools you’ll need for the migration. Many types of tools and services can be valuable:

  • Data integration tools are used to build or rebuild ETL processes to populate the data warehouse. Integration platform as a service (iPaaS) technology is especially well suited for ETL migration.
  • Data warehouse automation tools like WhereScape can be used to deconstruct legacy ETL, reverse engineer and redesign ETL processes, and regenerate ETL processes without the need to reconstruct data mappings and transformation logic.
  • Data virtualization tools such as Denodo provide a virtual layer of data views to support queries that are independent of storage location and adaptable to changing data structures.
  • System integrators and service providers like Atmosera can be helpful when manual effort is needed to extract data mappings and transformation logic that is buried in code.

Used individually or in combination, these tools and services can make a remarkable difference, speeding and de-risking the migration process.

Step 5: Migrate and operationalize; start by defining test and acceptance criteria. Plan the testing, then execute the migration process to move schema, data, and processing. Execute the test plan and, when successful, operationalize the cloud data warehouse and migrate users and applications.
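As a hedged example of one such acceptance check, you might compare row counts and a cheap aggregate per table between the old and new warehouses (the table and column names here are assumptions):

    # Post-migration acceptance check: row count plus a checksum-style aggregate
    # should match between source and target for each migrated table.
    def table_matches(src_conn, tgt_conn, table, numeric_col):
        query = f"SELECT COUNT(*), SUM({numeric_col}) FROM {table}"
        return src_conn.execute(query).fetchone() == tgt_conn.execute(query).fetchone()

    # Usage sketch: fail the cutover if any table diverges.
    # assert table_matches(legacy, cloud, "sales_fact", "amount")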

Learn more at SnapLogic’s upcoming webinar

To get the full story on data warehouse cloud migration, join me for an informative SnapLogic webinar, “Traditional Data Warehousing is Dead: How digital enterprises are scaling their data to infinity and beyond in the Cloud,” on Wednesday, August 16 at 9:00am PT. I’ll be presenting with Dave Wells, Data Management Practice Lead at Eckerson Group, and highlighting tangible business benefits that your organization can achieve by moving your data to the cloud. You’ll learn:

  • Practical best practices, key technologies to consider, and case studies to get you started
  • The potential pitfalls of “cloud-washed” legacy data integration solutions
  • Cloud data warehousing market trends
  • How SnapLogic’s Enterprise Integration Cloud delivers up to a 10X improvement in the speed and ease of data integration

Register today!

Ravi Dharnikota is Chief Enterprise Architect at SnapLogic. Follow him on Twitter @rdharn1

SnapLogic seeks partners to help drive LATAM growth

By Carlos Hernandez Saca

I am very excited to announce that SnapLogic is now seeking partners across Latin America (LATAM), including Mexico, Brazil, Argentina, Colombia, and Chile, to support our global expansion. This news follows our successful launch of SnapLogic’s global/US partner program, SnapLogic Partner Connect, just over a year ago.

We’ve got momentum. We’ve seen first-hand that SnapLogic dramatically transforms how companies integrate data, applications, and devices for digital business. In December 2016, SnapLogic secured an additional $40 million in funding to accelerate its global expansion. And we see the channel as an important driver of this expansion.

We’re making this call for partners to help us build a rich ecosystem of technology, consulting, and reseller partners in the LATAM market so that joint customers see the benefits of adopting software-as-a-service (SaaS) applications, cloud data management, and big data technologies. SnapLogic’s technology enables this by accelerating integration and driving digital transformation in the enterprise.

The SnapLogic Partner Connect program includes several categories, reflecting the breadth of the SnapLogic platform and depth of adjacent partner solutions. Partner types include:

  • Technology Partners – Providers of technologies with which SnapLogic integrates, including partners such as Salesforce, Workday, AWS, Microsoft, Cloudera, and Snowflake.
  • OEM/Managed Services Partners – OEM partners integrate SnapLogic into their own product or service offerings. OEM partners include Manthan, Reltio, Planview, and Verizon.
  • Global Systems Integrator Partners – SnapLogic’s GSI partners implement SnapLogic as part of comprehensive digital transformation projects at some of the world’s leading enterprises. These partners include firms such as Accenture, PwC, Cognizant, HCL, Tata Consultancy Services, and Tech Mahindra.
  • Consulting Partners – Organizations that provide integration consulting and best-practices services, including partners such as Interworks, Softtek, and Semantix Brazil.
  • Reseller Partners – The roster of SnapLogic international resellers is growing rapidly, increasing SnapLogic’s presence in new markets worldwide. Resellers include MatrixIT in Israel, KPC in France, Sogeti in the United Kingdom, Systemation in the Netherlands, Softtek in Mexico, and DRZ and Semantix in Brazil.

SnapLogic CEO Gaurav Dhillon says, “Extending our partners program into the LATAM channel is a crucial part of our business growth strategy. Working alongside strategic channel partners will provide an exciting opportunity to reach new LATAM customers and help them realize their digital transformation goals through connected data and applications.”

And I agree. The market opportunity for data, application, and device integration is large and growing, representing a unique opportunity for our partners in LATAM. As customers across the region increasingly use our modern, cloud-first platform to integrate and manage data across cloud, big data, IoT, mobile, social, and more, we look forward to working with new partners in LATAM who can help accelerate their digital transformation efforts.

SnapLogic’s self-service integration platform was recognized as a Leader in Gartner’s 2017 Magic Quadrant for Enterprise Integration Platform as a Service (iPaaS). Global enterprise customers such as Adobe, AstraZeneca, Box, Capital One, GameStop, Verizon and Wendy’s rely on SnapLogic to automate business processes, accelerate analytics and drive digital transformation. If you’re interested in partnering with us, we want to hear from you.

Carlos Hernandez Saca is the Sales & Partnership Area Director for Latin America at SnapLogic. You can follow him on Twitter @rivaldo71.


Our live demo series is back

By Rich Dill

Due to popular demand, we’re bringing back a regular demo series where you get a first-hand look at SnapLogic’s Enterprise Integration Cloud in action, hear from a technology expert, and have your questions answered … live.

If you want to learn more about what SnapLogic can bring to your application and data integration plans, this demo is for you. Join me Wednesday, August 9 from 10:30AM – 11:00AM PT for the kickoff of our Live Demo Series.

In this month’s demo, I will highlight the many reasons why the Enterprise Integration Cloud is the best option for quickly connecting and efficiently managing legacy and modern data, cloud application, and API integration needs.

I will be providing an overview of iPaaS and SnapLogic along with a detailed look inside the industry’s first AI-powered integration technology, Iris. I will also highlight how to connect an application to a database in minutes.

Each month we will showcase our industry-leading integration cloud, so if you can’t attend this demo, there will be others to join, each with a different theme.

Register now for the SnapLogic Live Demo Series

Rich Dill is Enterprise Solutions Architect at SnapLogic. You can follow him on Twitter @richdill.


Get your game plan on: Data warehouse migration to the cloud

By Ravi Dharnikota

You’ve decided to move your data warehouse to the cloud, and want to get started. Great! It’s easy to see why – in addition to the core benefits I wrote about in my last blog post, there are many more benefits associated with cloud data warehousing: incredibly fast processing, speedy deployment, built-in fault tolerance and disaster recovery, and, depending on your cloud provider, strong security and governance.

A six-step reality check

But before you get too excited, it’s time for a reality check: moving an existing data warehouse to the cloud is not quick, and it isn’t easy. It is definitely not as simple as exporting data from one platform and loading it into another. Data is only one of the six warehouse components to be migrated.

Tactically and technically, data warehouse migration is an iterative process that requires many steps to migrate all of the components, as illustrated below. Here’s everything you need to consider when migrating your data warehouse to the cloud.

1) Migrating schema: Before moving warehouse data, you’ll need to migrate table structures and specifications. You may need to make structural changes as part of the migration; in particular, indexing and partitioning schemes may need to be rethought (see the sketch after the diagram below).

Data Warehouse Migration Process
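As a sketch of what such rethinking can look like (the table is invented, and Redshift-style syntax is only one example), B-tree indexes and date partitions on-premises often map to distribution and sort keys on a cloud MPP warehouse:

    # Hypothetical DDL translation: on-prem indexes/partitions often become
    # distribution and sort keys on a cloud MPP warehouse (Redshift-style syntax).
    CLOUD_DDL = """
    CREATE TABLE sales_fact (
        id       BIGINT,
        store_id INT,
        sold_at  TIMESTAMP,
        amount   DECIMAL(12,2)
    )
    DISTKEY (store_id)   -- replaces the on-prem index used for joins on store_id
    SORTKEY (sold_at);   -- replaces the on-prem date-partitioning scheme
    """
    # cursor.execute(CLOUD_DDL)  # run against the target warehouse connection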

2) Migrating data: Moving very large volumes of data is process-intensive, network-intensive, and time-consuming. You’ll need to map out how long the migration will take and whether you can accelerate the process. If you restructure as part of schema migration, you may also need to transform data as part of the data migration. Can you transform in-stream, or should you pre-process and then migrate?
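A common acceleration pattern, sketched here with Redshift COPY syntax (the bucket and IAM role are placeholders), is to stage compressed extracts in object storage and let the warehouse load them in parallel:

    # Parallel bulk-load sketch: stage gzipped CSV extracts in S3, then issue a
    # single COPY that the warehouse fans out across its nodes.
    COPY_SQL = """
    COPY sales_fact
    FROM 's3://my-bucket/export/sales_fact/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/warehouse-loader'
    CSV GZIP;
    """
    # cursor.execute(COPY_SQL)  # executed on the cloud warehouse connection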

3) Migrating ETL: Moving data may be the easy part compared to migrating ETL processes. You may need to change the code base to optimize for platform performance and adjust data transformations to sync with data restructuring. You’ll need to determine whether data flows should remain intact or be reorganized. As part of the migration, you may need to reduce data latency and deliver near real-time data; if so, would it make sense to migrate ETL processing to the cloud as well? Is there a utility to convert your ETL code?

4) Rebuilding data pipelines: With any substantive change to data flow or data transformation, rebuilding data pipelines may be a better choice than migrating existing ETL. You may be able to isolate individual data transformations and package them as executable modules. You’ll need to understand the dependencies among data transformations to construct the optimum workflow, and to weigh the advantages – performance, agility, reusability, and maintainability – you may gain by rebuilding ETL as modular data pipelines using modern, cloud-friendly technology.
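A minimal sketch of that modular style, with plain Python functions standing in for pipeline stages (the transforms and sample data are invented):

    # Each transform is a small, independently testable module; a pipeline is an
    # ordered composition of such modules.
    from functools import reduce

    def dedupe(rows):
        seen, out = set(), []
        for r in rows:
            if r["id"] not in seen:
                seen.add(r["id"])
                out.append(r)
        return out

    def to_usd(rows, fx=1.1):
        return [{**r, "amount": round(r["amount"] * fx, 2)} for r in rows]

    def run_pipeline(rows, steps):
        return reduce(lambda acc, step: step(acc), steps, rows)

    raw = [{"id": 1, "amount": 100.0}, {"id": 1, "amount": 100.0}, {"id": 2, "amount": 50.0}]
    print(run_pipeline(raw, [dedupe, to_usd]))  # two rows, amounts converted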

5) Migrating metadata: Source-to-target metadata is a crucial part of managing a data warehouse; knowing data lineage is critical for tracing and troubleshooting when problems occur. How readily will this metadata transfer to a new cloud platform? Are all of the mappings, transformation logic, dataflows, and workflows locked in proprietary tools or buried in SQL code? You’ll need to determine whether you can export and import the metadata directly, reverse engineer it, or rebuild it from scratch.

6) Migrating users and applications: The final step in the process is migrating users and applications to the new cloud data warehouse without interrupting business operations. Security and access authorizations may need to be created or changed, and BI and analytics tools need to be reconnected. Plan what communication is needed, and with whom, to make the cutover smooth.

Don’t try to do everything at once

A typical enterprise data warehouse contains a large amount of data describing many business subject areas. Migrating an entire data warehouse in a single pass is usually not realistic. Incremental migration is the smart approach when “big bang” migration isn’t practical. Migrating incrementally is a must when undertaking significant design changes as part of the effort.

However, incremental migration brings new considerations. Data location should be transparent from the user’s point of view throughout the period when some data resides in the legacy data warehouse and some in the new cloud data warehouse. Consider a virtual layer as a single point of access that decouples queries from data storage location.
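As a sketch of that pattern (the names are invented, and in practice a data virtualization tool provides the cross-system view), a single logical table can union the not-yet-migrated and already-migrated portions:

    # One logical table, two physical homes during incremental migration.
    # A data virtualization layer makes this work across separate systems.
    VIRTUAL_VIEW = """
    CREATE VIEW sales_fact AS
    SELECT * FROM legacy.sales_fact_remaining
    UNION ALL
    SELECT * FROM cloud.sales_fact_migrated;
    """

Queries then target sales_fact and keep working no matter where the rows currently live.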

A hybrid strategy is another viable option. With a hybrid approach, your on-premises data warehouse can remain in operation as the cloud data warehouse comes online. During this transition phase, you’ll need to synchronize the data between the old on-premises data warehouse and the new one that’s in the cloud.

Cloud migration tools to the rescue

The good news is that there are many tools and services that can be invaluable when migrating your legacy data warehouse to the cloud. In my next post, the third and final in this series, I’ll explore tools for data integration, data warehouse automation, and data virtualization, as well as system integrator resources that can speed and de-risk the process.

Learn more at SnapLogic’s upcoming webcast, “Traditional Data Warehousing is Dead: How digital enterprises are scaling their data to infinity and beyond in the Cloud,” on Wednesday, August 16 at 9:00am PT. I’ll be presenting with Dave Wells, Data Management Practice Lead, Eckerson Group, and highlighting tangible business benefits that your organization can achieve by moving your data to the cloud. You’ll learn:

  • Practical best practices, key technologies to consider, and case studies to get you started
  • The potential pitfalls of “cloud-washed” legacy data integration solutions
  • Cloud data warehousing market trends
  • How SnapLogic’s Enterprise Integration Cloud delivers up to a 10X improvement in the speed and ease of data integration

Sign up today!

Ravi Dharnikota is Chief Enterprise Architect at SnapLogic. Follow him on Twitter @rdharn1

Gaurav Dhillon talks disconnected data on BBC Business Matters radio show

By Scott Behles 

After his inaugural appearance on BBC Business Matters in May, SnapLogic founder and CEO, Gaurav Dhillon, was back on the airwaves with the BBC World Service radio show last week.

Broadcast around the world, BBC Business Matters reaches millions of listeners every day with expert business views from journalists, academics, and business leaders. On this occasion, Gaurav joined BBC host Fergus Nicoll and Professor Jasper Kim of Ewha University in Seoul to discuss the pressing business stories of the day, alongside the findings of SnapLogic’s recent research, The High Cost of Disconnected Data.

As Gaurav explains in the clip below, the cost of disconnected data for businesses is staggering. Large enterprises in the UK and US are wasting $140 billion a year due to missed opportunities and lost productivity. The areas hardest hit by disconnected data? Companies can’t get new products to market quickly enough, and customer experience suffers.

Even though we are, as Gaurav puts it, “living in an age of data enlightenment,” in reality, old data silo habits die hard, with disconnected systems and collaboration challenges across the enterprise still getting in the way. Innovation is the lifeblood of business, and if it’s to thrive, these data barriers need to be torn down.

You can listen to the full broadcast here. The focused discussion about disconnected data starts around the 26:30 mark.

Scott Behles is Head of Corporate Communications at SnapLogic. Follow him on Twitter @sbehles

Data and analytics – behind and after an acquisition

By Karen He

Now more than ever, organizations need to move beyond innovation to grow their business, stay competitive, and remain relevant. It’s the combination of data and analytics that provides businesses with the right insights to move in the right direction. Amazon and Walmart have done just that: their relentless drive to lead the extremely competitive retail world has set them apart from many. A shared tactic, acquiring smaller competitors in their space, is something both companies have done well, and data is at the core.

In late 2016, Walmart strengthened its e-commerce strategy by acquiring Jet.com. It continued into 2017 by acquiring five other online retailers, including Moosejaw, ModCloth, Bonobos, Shoebuy.com, and Hayneedle.com.

Amazon, on the other hand, has advanced its brick-and-mortar strategy just this year, from opening its first physical store in New York City to its pending purchase of Whole Foods.

Acquisitions are easier said than done. The teams behind retail conglomerates like Walmart and Amazon do not buy companies by merely following their instincts. Instead, they rely heavily on data from multiple sources to develop a strategy for growing the business. In acquisitions, data shows retail leaders whether a deal is strategically feasible for the business.

The data behind the decision

Nowadays, the amount of data available to organizations is both a blessing and a curse. Businesses are surrounded by deep pockets of their own data, residing in different cloud-based and on-premises applications and databases. For the most part, this volume of enterprise data is extremely hard to retrieve without technical assistance. As a result, businesses continuously face the challenge of spending an immense amount of time and effort simply rounding up and compiling data.

Subsequently, retail leaders gather and analyze data to make an informed decision on whether or not to purchase another company. But most retailers are not in a position to make such decisions because that insightful yet elusive data resides in multiple places. They pivot from one tool to another, sifting through thousands of data sets before even getting to the analysis. These cumbersome, manual processes prevent retailers from gaining real-time insights, potentially keeping them from taking on first-mover initiatives and leapfrogging their competitors. Until now.

Companies that can pull data and derive insights in real time are empowered to transform and grow their business. Retailers need to pull data on demand to visualize complete insights and make sound business decisions. In Walmart’s and Amazon’s cases, they tapped into extensive data sources to understand whether they would gain more value by growing organically or by spending millions of dollars to acquire established companies. Of course, we know what they acquired, but not necessarily what they didn’t acquire, or why.

Post-acquisition alignment

Beyond all the pre-acquisition data and number crunching, both Amazon and Walmart are aware of the many M&A processes needed post-acquisition. Once a company acquires another, the parent company must realign the business by consolidating virtually all the departments, operations, and processes of both companies. In Amazon’s and Walmart’s case, consolidating business operations and supply chains involves complex data migrations. For conglomerates like Amazon and Walmart to have a seamless flow of information, subsidiaries need to migrate the data from all their systems and applications into the parent company’s.

Without realignment, business users across functions can become unproductive due to missing data or the inefficient manual labor of connecting data files from disparate systems and applications. A marketing department alone may have at least half a dozen applications, including CRM, marketing campaign automation, web analytics, marketing intelligence, predictive analytics, content management, and social media management systems. Gaps in marketing reports and insights emerge, for example, when marketers use duplicate data from different marketing applications, potentially lowering business performance. To fold in existing operations and processes, companies need a smarter way to connect systems or migrate data from one system to another.

Amazon and Walmart are proof points of how companies must innovate and grow in the competitive retail market. Businesses across industries should also look into their data to unearth growth opportunities. Complete, real-time data and analytics empower business professionals to expand their business and stay competitive in the market.

Learn more about connecting systems and applications to fuel rich data and analytics in this recorded webcast.

Karen He is Product Marketing Manager at SnapLogic. Follow her on Twitter @KarenHeee.