Are you ready? Take the cloud data warehouse readiness test

By Ravi Dharnikota

Is a cloud data warehouse right for you? SnapLogic teamed up with The Eckerson Group and SnapLogic technology partner Snowflake to help you prepare your business for a cloud data warehouse migration.

Today, most organizations face many questions when deciding whether to move to the cloud.

Questions include:

  • When is the best time to migrate to a cloud data warehouse?
  • Should an organization embrace the cloud now or stick with an on-premises solution?
  • Should it migrate an existing data warehouse to the cloud, or start with specific workloads on a case-by-case basis?
  • Should it build net new data marts in the cloud and work its way up to a cloud data warehouse?

If these are questions you’re asking yourself, look no further. The Eckerson Group has crafted a readiness assessment to help you understand the questions and issues you need to address before embarking on a cloud data warehouse migration.

The assessment tool takes just 10 to 15 minutes to complete and, at the end, delivers a report on next steps that you can share with team members.

Take the survey now and let us know what you think!

Ravi Dharnikota is Chief Enterprise Architect at SnapLogic. Follow him on Twitter @rdharn1


SnapLogic Summer 2017 Release: Automating your processes for greater productivity

In the Summer 2017 (4.10) release, I am happy to highlight several new and enhanced features on the SnapLogic Enterprise Integration Cloud platform. Through these platform enhancements, we are enabling our users to further automate their processes for greater productivity.  

Local Dashboard for Ultra Pipelines

We have added a dashboard that can be accessed at the node level. While all users can leverage this capability for Groundplexes, it is most significant for those running Ultra pipelines. Ultra pipelines are always-on, low-latency pipelines that deliver data in real time. The Local Dashboard gives users visibility into which pipelines are executing during maintenance windows, or in any other scenario where a customer cannot access the SnapLogic cloud dashboard, giving them confidence that those pipelines are running as expected. More details about Ultra pipelines can be found on the blog as well as the documentation site.

Figure 1: Users can visualize Ultra pipelines within the Local dashboard.

Watch the Local Dashboard Demo Here.

Project-Level Migration

Today, users follow varying workflows for migrating projects from one environment to another. Some leverage the public Import and Export API, while others customize their migration using the SnapLogic Metadata Snap. In this release, we are adding support for project-level migration through the UI. It mirrors the functionality of the Import/Export API, now available directly in the UI, and will help users who currently rely on the public API or on manual UI import/export to move projects between organizations.

Figure 2: Users can perform migrations between organizations within the UI.

Watch the Project-Level Migration Demo Here.

Notifications & Alerts

Previously, we added system alerts to notify users of system-related activities, such as pipeline queuing. In this release, we added the ability to set up email notifications to help users better track and monitor usage of the system. These early warning notifications include the following:

  • User Creation: Be notified whenever a user is added to or removed from the platform. This alert allows users to track which stakeholders are collaborating in the platform.
  • Password Attempts: Administrators are alerted after multiple failed password attempts.
  • Concurrent/Max: Keep track of API usage to prioritize high-value projects. Users are alerted when they exceed their API usage limits.
  • Project Modification: Learn which stakeholders are collaborating on and modifying projects in real time.

Figure 3: Users can set up notifications and alerts to track and audit system usage.

Watch the Notifications and Alerts Demo Here.

Public API for Groundplex installation and configuration

Organizations with DevOps teams have higher automation requirements. The SnapLogic team continuously adds public APIs to help DevOps professionals further automate their integration projects. In this release, we reached a notable milestone with the Groundplex API, allowing DevOps teams to fully automate the installation and configuration of a node in a repeatable fashion.
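As a rough illustration of what such automation might look like, the sketch below assembles a request body for a hypothetical node-installation endpoint. The field names and structure here are illustrative assumptions only, not SnapLogic's documented API; consult the product documentation for the real interface.

```python
import json

def build_node_install_request(org, plex_name, node_host):
    """Assemble a request body for a hypothetical Groundplex
    installation API call (field names are illustrative only)."""
    payload = {
        "organization": org,
        "snaplex": plex_name,
        "node": {"host": node_host, "type": "groundplex"},
    }
    return json.dumps(payload)

# A DevOps script could generate one such request per node to provision,
# making the install repeatable instead of a manual download-and-configure.
body = build_node_install_request("acme", "prod-plex", "node01.acme.internal")
```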

For a complete list of features and functionality in our most recent release, please see the Summer 2017 release blog.  

As always, we value our users’ feedback so please share your thoughts and ideas of what you would like to see in future releases.

Partnering together for the future.
-Tim Lui, Platform Product Manager

Tim Lui is Platform Product Manager at SnapLogic. Follow him on LinkedIn.

Productivity killer: Disconnected data is holding workers back

By Scott Behles

A productive business is, more often than not, a successful one, and a productive employee is a happy employee. While the modern business world is, admittedly, a little more complex than this, these truisms still hold water.

However, as noted by McKinsey, productivity growth in G20 nations has been stagnating, which is no doubt causing worry for governments, businesses, and their employees.

But why is productivity slowing to a crawl? After all, we live in a time of unparalleled technological advancement, where digital transformation projects have been specifically designed to boost organizational speed and agility, and make our working lives easier and more productive.

As the second part of our study into disconnected data reveals, businesses are struggling to access and integrate enterprise data across their ever-growing number of applications and systems, turning the very systems meant to help us into ones that ultimately harm productivity.

Nearly all of our respondents (98%) in the study indicated that they were involved in projects that rely on company data from multiple systems and departments, and that they regularly use seven different business apps and systems. Those in IT use more, on average, than business users (eight compared to five), and companies in the financial services sector use even more, clocking in at an average of nine apps.

This is too much data and too many systems for anyone to handle manually. When employees are overloaded with multiple systems and the data is not properly integrated, much of the painstaking work of searching for data, data entry, processing, analysis, and integration inevitably falls to them. This is neither the quickest nor the most efficient way to perform such tasks, creates a greater likelihood of error, and detracts from more valuable and mentally stimulating work.

Shockingly, nine out of ten business users we asked said they’re involved in performing these mind-numbing tasks and, unsurprisingly, nearly two-thirds (61%) of our respondents expressed frustration that projects suffer delays caused by poor integration of data.

The impact of sub-par data integration on productivity can be significant, particularly for large businesses like those we surveyed. All in all, businesses are wasting 19 working days a year, per employee, by asking their skilled employees to perform these rote tasks, while simultaneously frustrating their workforce.

What’s likely most frustrating for employees is that they’re aware that the route to greater productivity lies in better data integration, but not enough is being done to address it.  Nearly two-thirds of our respondents stated that poor data integration practices, which are too often manual and sidestep automation, are negatively impacting productivity, and an overwhelming majority (91%) pointed to connecting data, applications and systems as an important move for their organization. In fact, over a third – likely the most heavily burdened with manual data tasks – see it as essential. Our respondents speculate that, if these data gaps were closed, they could see a 28% boost in efficiency on average.
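The study's figures translate into concrete costs. A back-of-the-envelope sketch, using the study's 19-days-lost figure (the headcount, salary, and working-year numbers below are illustrative assumptions, not from the study):

```python
# Rough annual cost of disconnected data, from the study's figure of
# 19 working days lost per employee per year to manual data tasks.
WORKING_DAYS_PER_YEAR = 230   # assumption: a typical working year
DAYS_LOST_PER_EMPLOYEE = 19   # from the study

def annual_cost_of_disconnected_data(employees, avg_salary):
    """Estimate salary spend lost per year to manual data work."""
    daily_rate = avg_salary / WORKING_DAYS_PER_YEAR
    return employees * DAYS_LOST_PER_EMPLOYEE * daily_rate

# Example: a 1,000-person company with a $60,000 average salary
cost = annual_cost_of_disconnected_data(1000, 60_000)  # ~ $4.96M per year
```

Even at modest assumptions, the lost time compounds into millions of dollars a year before counting the 28% efficiency boost respondents believe better integration would unlock.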

When businesses invest in digital transformation projects to improve efficiency, productivity, and competitive edge, they have to take a long-term view of how new systems and apps will interact with each other. Burdening employees with the onerous task of manually migrating data between systems easily counteracts any productivity benefits these new tools were implemented to deliver, and serves to frustrate and bore skilled employees. Remember, when all’s said and done, a productive employee is a happy employee.

To review the full results of our study, download “The Productivity Pains of Disconnected Data.”

Miss the first part of our study? Read “The High Cost of Disconnected Data.”

Scott Behles is Head of Corporate Communications at SnapLogic. Follow him on Twitter @sbehles.

How SnapLogic enables faster digital transformation with Azure cloud migration

By Pavan Venkatesh

Digital transformation is a key initiative many organizations are undertaking to deliver value to their customers rapidly. This type of initiative requires fundamental organizational change, including operational change, cultural and leadership change, innovation through the adoption of new business models, and an improved experience for ecosystem partners and customers.

A recent IDC report shows that “By 2018, 70% of siloed digital transformation (DX) initiatives will ultimately fail because of insufficient collaboration, integration, sourcing, or project management.” Hence, it is essential for organizations to have the appropriate set of digital tools, expertise, mindset, and integration mechanisms to achieve digital transformation.

Organizations should fold a cloud strategy into their digital transformation efforts, allowing them to migrate data from on-premises environments to the cloud. By migrating data to the cloud, organizations gain operational agility and can enable new capabilities quickly.

Microsoft’s cloud-first strategy

Microsoft, too, has fully embraced a cloud-first strategy: SQL Server’s newest capabilities are first released to Azure SQL Database in the cloud, and only later to the on-premises SQL Server database.

In the SnapLogic Enterprise Integration Cloud Summer 2017 Release (4.10), we launched the new Azure SQL DB Snap Pack that provides abstractions to users and enables them to quickly move data from an on-premises environment to the Azure cloud.

Azure SQL DB is a relational database-as-a-service that uses the SQL Server engine underneath. It provides multi-tenancy and can scale out based on application needs with no downtime. SnapLogic offers an abstraction layer of components, called Snaps, allowing users to perform various operations on Azure SQL DB without any coding. The following Azure SQL Snaps are provided in the Summer 2017 release:

  • Azure SQL Bulk Load: The Bulk Load Snap enables users to quickly move on-premises data stored in databases like MySQL or SQL Server, or in file systems, to Azure SQL DB in the cloud. It uses the BulkCopy API, introduced in the Microsoft JDBC Driver 4.2 for SQL Server, to stream data directly to Azure SQL DB without relying on the BCP command-line utility. This eliminates the need to generate temporary files during the process, as the data is handled in memory. It is fast!

This Snap can be used in cloud or on-premises environments alike.

  • Azure SQL Bulk Extract: The Bulk Extract Snap allows users to move large amounts of data stored in Azure SQL DB to downstream systems such as Azure Blob Storage, Azure Data Lake Store, Azure SQL Data Warehouse, or Amazon Redshift. This Snap uses the BCP command-line utility to extract data, storing it temporarily in the local system before moving it to the designated destination.
  • Azure SQL Execute: This Snap executes various SQL statements (select, insert, delete) and can be used in a pipeline to perform respective database operations.
  • Azure SQL Stored Procedure: This Snap invokes a stored procedure in the Azure SQL DB.
  • Azure SQL Table List: This Snap connects to Azure SQL DB, reads its metadata, and outputs a list of tables in a database.
  • Azure SQL Update: This Snap updates database columns associated with a table based on a given condition.

The Azure SQL Snap Pack supports two types of authentication:

  • SQL Authentication (Username and password)
  • ActiveDirectoryPassword (standard AD integration)
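These two modes correspond to standard JDBC connection properties for Azure SQL (`authentication=SqlPassword` and `authentication=ActiveDirectoryPassword`). A minimal sketch of how the two connection strings differ; the server and database names are placeholders:

```python
def azure_sql_jdbc_url(server, database, auth="SqlPassword"):
    """Build a JDBC URL for Azure SQL DB.

    auth: "SqlPassword" for SQL authentication (username/password),
          "ActiveDirectoryPassword" for Azure AD integration.
    """
    return (
        f"jdbc:sqlserver://{server}.database.windows.net:1433;"
        f"database={database};encrypt=true;authentication={auth}"
    )

# SQL authentication (credentials supplied separately by the client)
url_sql = azure_sql_jdbc_url("myserver", "mydb")
# Azure Active Directory password authentication
url_ad = azure_sql_jdbc_url("myserver", "mydb", auth="ActiveDirectoryPassword")
```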

The following are some of the use cases where users can gain value from the Azure SQL Snap Pack:

  • Migrating on-premises databases (SQL Server, MySQL, or Oracle) to Azure SQL DB in the cloud.
  • Moving data from Azure SQL DB to Azure Data Lake, Amazon Redshift, or other cloud data warehouses for analytics.
  • Supporting organizations strategically invested in Microsoft Azure or the broader Microsoft ecosystem.

Azure SQL sample pipelines

Below is a sample pipeline with details. The goal is to move data stored in on-premises environments, such as files and SQL Server, to Azure SQL DB in the cloud. Users can select an existing schema name and table name in the Snap settings or create a new table by enabling that option. The batch size can be tuned based on the data size and how fast users want to load data.
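The batch-size tradeoff above is generic to any bulk load: larger batches mean fewer round trips but more memory held per request. A minimal, product-agnostic chunking sketch:

```python
def batches(rows, batch_size):
    """Yield successive batches of rows for a bulk load."""
    if batch_size < 1:
        raise ValueError("batch_size must be >= 1")
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

# 10 rows loaded with a batch size of 4 -> 3 round trips (4, 4, 2 rows)
sizes = [len(b) for b in batches(list(range(10)), 4)]
```

Tuning is a matter of measuring: start with a moderate batch size, then increase it until load throughput stops improving or memory pressure appears.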

In the second pipeline, data is extracted from Azure SQL DB and moved to Azure Data Lake Store so that users can run analytics on top of it. More on Azure Data Lake can be found in my previous blog post.

A cloud strategy is imperative for organizations moving toward digital transformation, enabling business agility and quick workforce enablement. This includes moving data from legacy on-premises systems to the cloud. SnapLogic, an enterprise integration cloud platform, equips customers with the right set of Snaps, including Azure SQL DB among more than 400 others, to easily move data to the cloud and meet digital transformation goals.

Interested in more? Watch the Azure SQL Demo here.

For a complete list of features and functionality in our most recent release, please see the Summer release blog post.

Pavan Venkatesh is a Senior Product Manager at SnapLogic. Follow him on Twitter @pavankv.

Summer Release 2017: Focus on operational efficiency and artificial intelligence

By Dinesh Chandrasekhar

SnapLogic Enterprise Integration Cloud’s Summer Release for 2017 (v4.10) is generally available for customers today. We are excited to share a lot of new features and enhancements that revolve around the themes of operational efficiency for digital transformation, continuing innovation in the use of Iris Artificial Intelligence, and added support for Cloud Data Warehouse and Big Data. While we stay focused on offering one of the finest and easiest user experiences in the iPaaS space, we wanted to extend that experience over to the operational side as well.

A few key highlights from this release that focus on improving operational efficiencies and the use of Artificial Intelligence:

  • AI-Powered Pipeline Optimization: SnapLogic’s Iris technology now evaluates pipeline configuration and operation to recommend specific performance improvements based on machine learning, delivered to users via dashboard alerts and notifications.
  • Local Snaplex Dashboard: Ultra Pipelines have been used by our customers in mission-critical use cases that warrant uninterrupted data flow across multiple data sources. Now, we have introduced a local dashboard that provides full visibility into Ultra Pipelines. This offers customers increased confidence that their high-priority projects are running automatically without any interruption.
  • API Threshold Notifications: In a similar vein, we are introducing monitoring capabilities focused on operations and project development. Real-time alerts and notifications give customers an early warning system for complex integration projects. Alerts can be set up to notify stakeholders when API limits are approached or exceeded, when users are added to or removed from the platform, and when projects are modified.
  • Project-level Migration: We’ve been listening to your requests! With the fast pace at which integration projects are executed and delivered across the enterprise, it is critical that project lifecycles do not become a bottleneck. We’ve introduced a new feature that automates the migration of projects across departments and organizations, eliminating hand-coding and hard-coding and resulting in seamless project migrations.
  • Public API for Groundplex Installation: A new API is now available to automate the installation and configuration of on-premises SnapLogic nodes, eliminating the time and effort to manually download and install additional nodes. 

A few key highlights from this release that focus on enhancing our already comprehensive integration capabilities into cloud data warehousing and big data:

  • Microsoft Azure SQL Snap Pack: A brand new Snap Pack that allows customers to quickly and easily migrate on-premises databases to Microsoft Azure SQL Database for fast cloud-based reporting and analytics.
  • Teradata Select Snap: Expands the Teradata Snap Pack, allowing users to retrieve data from a Teradata database to display on a table for easy reporting tasks.
  • Parquet Writer Snap: Allows users to store data in specific partitions in Hadoop HDFS, improving the ability to create reports and derive meaningful insights from big data.
  • Parquet Reader/Writer Snap: Allows users to read and write Parquet files on Amazon S3, authenticating via AWS Identity & Access Management (IAM), in addition to HDFS, expanding SnapLogic’s support for cloud-based data warehousing via Redshift.

Watch the Summer 2017 Release webinar to learn all the features and enhancements across our Snaps, IoT, CRM, and more.

Dinesh Chandrasekhar is Director of Product Marketing at SnapLogic. Follow him on Twitter @AppInt4All.

The bigger picture: Strategizing your data warehouse migration

By Ravi Dharnikota

If your organization is moving its data warehouse to the cloud, you can be confident you’re in good company. And if you read my last blog post about the six-step migration process, you can be even more confident that the move will go smoothly. However, don’t pull the trigger just yet. You’ve got a bit more planning to do, this time at a more strategic level.

First, let’s recap the migration process I covered in my last post, of the data warehouse itself. In that blog post, I broke down all the components of this diagram:

Data Warehouse Migration Process

Now, as you can see in the diagram below, the data warehouse migration process is itself part of a bigger picture of migration planning and strategy. Let’s take a look at the important pre-migration steps you can take to help ensure the migration’s success.

Migration Strategy and Planning

Step 1: Define Goals and Business Case. Start the planning process with a clear picture of the business reasons for migrating your data warehouse to the cloud. Common goals include:

  • Agility in terms of both the business and the IT organization’s data warehousing projects.
  • Performance on the back end, to ensure timeliness and availability of data, and on the front end, for fast end-user query response times.
  • Growth and headroom to ease capacity planning; the elastic scalability of cloud resources mitigates this problem.
  • Cost savings on hardware, software, services, space, and utilities.
  • Labor savings from reduced needs for database administration, systems administration, scheduling and operations, and maintenance and support.

Step 2: Assess the current data warehouse architecture. If the current architecture is sound, you can plan to migrate to the cloud without redesign and restructuring. If architecturally sufficient for BI but limited for advanced analytics and big data integration, you should review and refine data models and processes as part of the migration effort. If the current architecture struggles to meet current BI requirements, plan to redesign it as you migrate to the cloud.

Step 3: Define the migration strategy. A “lift and shift” approach is tempting, but it rarely succeeds. Changes are typically needed to adapt data structures, improve processing, and ensure compatibility with the chosen cloud platform. Incremental migration is more common and usually more successful.

As I mentioned in my last blog post, a hybrid strategy is another viable option. Here, your on-premises data warehouse can remain operating as the cloud data warehouse comes online. During this transition phase, you’ll need to synchronize the data between the old on-premises data warehouse and the new one that’s in the cloud.

Step 4: Select the technology including the cloud platform you’ll migrate to, and which tools you’ll need for the migration. There are many types of tools and services that can be valuable:

  • Data integration tools are used to build or rebuild ETL processes to populate the data warehouse. Integration platform as a service (iPaaS) technology is especially well suited for ETL migration.
  • Data warehouse automation tools like WhereScape can be used to deconstruct legacy ETL, reverse engineer and redesign ETL processes, and regenerate ETL processes without the need to reconstruct data mappings and transformation logic.
  • Data virtualization tools such as Denodo provide a virtual layer of data views to support queries that are independent of storage location and adaptable to changing data structures.
  • System integrators and service providers like Atmosera can be helpful when manual effort is needed to extract data mappings and transformation logic that is buried in code.

Using these tools and services, individually or in combination, can make a remarkable difference, speeding and de-risking the migration process.

Step 5: Migrate and operationalize; start by defining test and acceptance criteria. Plan the testing, then execute the migration process to move schema, data, and processing. Execute the test plan and, when successful, operationalize the cloud data warehouse and migrate users and applications.

Learn more at SnapLogic’s upcoming webinar

To get the full story on data warehouse cloud migration, join me for an informative SnapLogic webinar, “Traditional Data Warehousing is Dead: How digital enterprises are scaling their data to infinity and beyond in the Cloud,” on Wednesday, August 16 at 9:00am PT. I’ll be presenting with Dave Wells, Leader of Data Management Practice, Eckerson Group, and highlighting tangible business benefits that your organization can achieve by moving your data to the cloud. You’ll learn:

  • Practical best practices, key technologies to consider, and case studies to get you started
  • The potential pitfalls of “cloud-washed” legacy data integration solutions
  • Cloud data warehousing market trends
  • How SnapLogic’s Enterprise Integration Cloud delivers up to a 10X improvement in the speed and ease of data integration

Register today!

Ravi Dharnikota is Chief Enterprise Architect at SnapLogic. Follow him on Twitter @rdharn1

SnapLogic seeks partners to help drive LATAM growth

By Carlos Hernandez Saca

I am very excited to announce that SnapLogic is now seeking partners across Latin America (LATAM) – Mexico, Brazil, Argentina, Colombia, and Chile – to support our global expansion. This news follows the successful launch of SnapLogic’s global/US partner program, SnapLogic Partner Connect, just over a year ago.

We’ve got momentum. We’ve seen first-hand that SnapLogic dramatically transforms how companies integrate data, applications, and devices for digital business. In December 2016, SnapLogic secured an additional $40 million in funding to accelerate its global expansion. And we see the channel as an important driver of this expansion.

We’re making this call for partners to help us build a rich ecosystem of technology, consulting, and reseller partners in the LATAM market so that joint customers see the benefits of adopting software-as-a-service (SaaS) applications, cloud data management, and big data technologies. SnapLogic’s technology enables this by accelerating integration and driving digital transformation in the enterprise.

The SnapLogic Partner Connect program includes several categories, reflecting the breadth of the SnapLogic platform and depth of adjacent partner solutions. Partner types include:

  • Technology Partners: Providers of technologies with which SnapLogic integrates, including Salesforce, Workday, AWS, Microsoft, Cloudera, and Snowflake.
  • OEM/Managed Services Partners: OEM partners integrate SnapLogic into their product or service offerings. They include Manthan, Reltio, Planview, and Verizon.
  • Global Systems Integrator Partners: SnapLogic’s GSI partners implement SnapLogic as part of comprehensive digital transformation projects at some of the world’s leading enterprises. They include firms such as Accenture, PwC, Cognizant, HCL, Tata Consultancy Services, and Tech Mahindra.
  • Consulting Partners: Organizations that provide integration consulting and best-practices services, including Interworks, Softtek, and Semantix Brazil.
  • Reseller Partners: The roster of SnapLogic international resellers is growing rapidly, increasing SnapLogic’s presence in new markets worldwide. Resellers include MatrixIT in Israel, KPC in France, Sogeti in the United Kingdom, Systemation in the Netherlands, Softtek in Mexico, and DRZ and Semantix in Brazil.

SnapLogic CEO Gaurav Dhillon says, “Extending our partners program into the LATAM channel is a crucial part of our business growth strategy. Working alongside strategic channel partners will provide an exciting opportunity to reach new LATAM customers and help them realize their digital transformation goals through connected data and applications.”

And I agree. The market opportunity for data, application, and device integration is large and growing, representing a unique opportunity for our partners in LATAM. As customers across the region increasingly use our modern, cloud-first platform to integrate and manage data across cloud, big data, IoT, mobile, social, and more, we look forward to working with new partners in LATAM who can help accelerate their digital transformation efforts.

SnapLogic’s self-service integration platform was recognized as a Leader in Gartner’s 2017 Magic Quadrant for Enterprise Integration Platform as a Service (iPaaS). Global enterprise customers such as Adobe, AstraZeneca, Box, Capital One, GameStop, Verizon and Wendy’s rely on SnapLogic to automate business processes, accelerate analytics and drive digital transformation. If you’re interested in partnering with us, we want to hear from you.

Carlos Hernandez Saca is the Sales & Partnership Area Director for Latin America at SnapLogic. You can follow him on Twitter @rivaldo71.