Articles

Test Data Management with no hassle?

Yes, and we’ve got the solution…

Managing test data sounds like a pain in the neck, and most Development and QA teams depend on experts – mainly DBAs – to do the work for them. This seems logical: it is data, and data is perceived as complex, sensitive and never-ending – the list of descriptors goes on and on.

Yes, data is the heart of the business and its applications. But most organizations find it hard to keep supplying relevant data to their Dev and QA teams because of this perceived complexity – a reality that jeopardizes the business. The development cycle grows longer and quality suffers, because tests run on largely irrelevant data or on old data that no longer reflects current production.

Accelario solves this problem with a focused, to-the-point software solution that lets Dev and QA teams build, refresh, synchronize and secure their own test data. Unlike other TDM (Test Data Management) solutions, Accelario is built on the following principles.

No need for a data expert

Accelario is self-service software; no data expertise is needed at all. QA or test managers can operate it themselves, enjoying better performance and results than 80% of the DBA community. All decisions about how to carry out data builds, refreshes and other actions are made autonomously by the software, unlike other TDM solutions that require data expertise to operate.

Best of all, implementation is simple and straightforward – a few clicks and your Accelario server is ready, installed either on premises or in the cloud. Other TDM solutions need a lengthy implementation time frame and usually require a dedicated server, storage and additional hardware, which inherently makes them more complex and costlier.

Masking is done on the fly – production data is secured at all times

Masking is a non-issue for Accelario. It is done on the fly as part of the build or refresh of your database, adds at most 2% performance overhead, and exposes no production data, because masking happens at the "select" stage of the data. This means data is secured in transit as well. Some TDM solutions cannot mask data at all (physical, storage-based solutions), while others can mask data only with an "in-place" operation on the database after the copy, leaving production data exposed in transit as well as on the target server.

Synthetic data? Easy

Synthetic data is built in record time – billions of records per hour, faster than any other solution. The generated data maintains high statistical relevance to the production data, because Accelario uses the metadata and the structure of the production data to build it. Anyone can run the build; no data expert is needed. Other TDM solutions lack a synthetic data component, and customers must add software to make the solution whole.

One click to build a new environment – No extra disk space needed

One click – and Accelario's virtualization software lets the user make a copy of the database. The virtual copy is ready instantly, with no disk space needed at all, and has no dependency on the source data, its location or its capacity, unlike other TDM solutions, which require expertise, extensive disk space and the TDM server's network to execute the process.

Accelario. A unique, simple-to-operate TDM solution, available for on-premises or public cloud use. You are welcome to contact us to get access to our 14-day demo trial.

 


TDM – STOP WAITING FOR YOUR DATA! 

Continuous Test Data Management (TDM)

Accelario is a unique Test Data Management software solution, tailored specifically to the requirements of Dev & QA teams. With Accelario you can accelerate test data management and cloud adoption while safely delivering data in record time at significantly lower cost. We support leading databases such as Oracle, MSSQL, MySQL, PostgreSQL and others.

The software includes the following core modules to enable continuous test data management:

  • Build & Refresh
  • Masking
  • Sub-setting
  • Synthetic data injector
  • Replication & Virtualization
  • Automation and DevOps pipelines

The Accelario solution is self-service: it works in an optimized way without any dependency on the operator's expertise, including filtering out data-related errors while it runs.

————————————————————————————————————————————–

BUILD & REFRESH

The Accelario TDM solution performs an initial database load from any source to any target (across operating systems, cloud infrastructure – IaaS and PaaS, database versions, file systems and compression technologies), and later refreshes the target environment with new data.

This module is about 20x faster than any similar technology, thanks to two unique, patented capabilities:

 

Autonomous TDM 

 

Prior to any execution, the Accelario solution scans the whole database and automatically decides how to optimally copy the relevant data and objects to the target system. It also monitors the process during execution, automatically detecting and fixing errors, and dynamically changes how it works, auto-adjusting to any environment or performance change. Autonomous TDM covers the following areas of the process:

    • Prior to execution – prioritization and parallelization
    • Execution – automatic error recovery and dynamic process behavior
    • Summary – automatic comparison of source and target, resolving any missing parts

 

Performance algorithm

 

A unique Accelario algorithm parallelizes all copy processes, including challenging objects such as LOBs, CLOBs and large tables: it divides those objects into smaller parts and copies them in parallel.

Both features, Autonomous TDM and the performance algorithm, manage the work before and during a build or refresh, shaping the methodology and technology of the copy action at runtime to achieve optimized results.

Example 1: A table copy that stops in the middle resumes from the point of failure rather than copying the whole table again.

Example 2: If the network connection varies during runtime, the software automatically changes how objects are divided to match the new conditions.
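
To make the idea concrete, here is a minimal, hypothetical Python sketch of such a chunked, resumable copy (not Accelario's actual implementation; the key column, chunk size and copy_range helper are placeholders): the table is split into key ranges, ranges are copied in parallel, and completed ranges are checkpointed so a restart resumes from the point of failure.

```python
# Hypothetical sketch of a chunked, resumable parallel table copy.
# Not Accelario's implementation; names and sizes are placeholders.
import concurrent.futures
import json
import os

CHECKPOINT = "copy_checkpoint.json"   # records ranges already copied
CHUNK_SIZE = 100_000                  # rows per chunk

def copy_range(lo, hi):
    """Copy rows with lo <= id < hi from source to target, e.g. by
    streaming SELECT * FROM orders WHERE id >= lo AND id < hi."""
    # ... source read + target bulk insert would go here ...
    return (lo, hi)

def load_done():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return {tuple(r) for r in json.load(f)}
    return set()

def parallel_copy(min_id, max_id, workers=8):
    done = load_done()  # on restart, finished ranges are skipped
    ranges = [(lo, min(lo + CHUNK_SIZE, max_id + 1))
              for lo in range(min_id, max_id + 1, CHUNK_SIZE)]
    todo = [r for r in ranges if r not in done]
    with concurrent.futures.ThreadPoolExecutor(workers) as pool:
        for finished in pool.map(lambda r: copy_range(*r), todo):
            done.add(finished)
            with open(CHECKPOINT, "w") as f:
                json.dump(sorted(done), f)  # checkpoint progress
        
parallel_copy(min_id=1, max_id=5_000_000)
```

A production-grade tool would also resize the chunks dynamically as network conditions change, which is the behavior Example 2 describes.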

————————————————————————————————————————————–

MASKING

The Accelario solution supports different masking methods, such as Substitution, Shuffling and Tokenization, and lets the user decide which masking methods to apply and which data to mask. This module supports:

  • In-place masking – the operation runs on the database server itself
  • On-the-fly masking – performed as part of the BUILD & REFRESH described above

Masking is applied at the SELECT stage on the source, so no sensitive data is ever exposed outside the source. The operation adds at most 2% load to the process as a whole, so it has virtually no effect on process time and performance.
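
As a rough illustration of the on-the-fly approach (a sketch only, not Accelario's masking engine; the connection details, table and columns are hypothetical), the masking expressions can be embedded directly in the SELECT that reads from the source, so raw values never leave the source server:

```python
# Hedged sketch: masking applied inside the SELECT on the source,
# so sensitive values never cross the wire in clear text.
# Connection details, table and column names are hypothetical.
import mysql.connector

conn = mysql.connector.connect(
    host="source-db", user="reader", password="<pw>", database="prod")
cur = conn.cursor()
cur.execute("""
    SELECT customer_id,
           CONCAT('customer_', customer_id)         AS full_name,  -- substitution
           CONCAT(SHA2(email, 256), '@masked.test') AS email       -- tokenization
    FROM customers
""")
for row in cur.fetchall():
    pass  # rows arrive already masked; load them into the target here
conn.close()
```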

————————————————————————————————————————————–

SUB-SETTING

The Accelario solution gives the user the option to copy any subset of the data rather than the full database (see the sketch after this list):

  • Specific tables
  • Specific users
  • Date ranges
  • Custom SELECT and WHERE clauses
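
For illustration only (the table names and predicates are made up), a subset extraction essentially boils down to generating SELECT statements with the chosen filters:

```python
# Hypothetical sketch of sub-setting: copy only selected tables,
# filtered by a date range or a custom WHERE clause.
subset = {
    "orders":    "order_date BETWEEN '2024-01-01' AND '2024-03-31'",
    "customers": "region = 'EMEA'",
    "products":  None,  # whole table
}

for table, predicate in subset.items():
    query = f"SELECT * FROM {table}"
    if predicate:
        query += f" WHERE {predicate}"
    print(query)  # a real tool would stream these rows to the target
```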

————————————————————————————————————————————–

SYNTHETIC DATA INJECTOR

Accelario's synthetic data injector is about 10x faster than any other system on the market: we inject (insert) data in record time, comparable to data selection speed – billions of records per hour.

The created data has a high statistical correlation to the source data and configuration (a toy sketch follows the list):

  • Metadata is copied from the source
  • Source data statistics are collected and reproduced on the target (with newly created data)
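
Here is a toy Python sketch of the statistical idea only (not the actual injector; the column names and distributions are illustrative): collect simple column statistics from the source, then generate new rows that match them.

```python
# Toy sketch of statistics-driven synthetic data (not Accelario's engine).
# Column names, statistics and distributions are illustrative only.
import random
import string

# Statistics that would be collected from the source, e.g. via
# SELECT MIN(amount), MAX(amount), AVG(CHAR_LENGTH(name)) FROM ...
stats = {"amount": {"min": 5.0, "max": 900.0},
         "name":   {"avg_len": 12}}

def synthetic_row(row_id):
    name = "".join(random.choices(string.ascii_lowercase,
                                  k=stats["name"]["avg_len"]))
    amount = round(random.uniform(stats["amount"]["min"],
                                  stats["amount"]["max"]), 2)
    return (row_id, name, amount)

rows = [synthetic_row(i) for i in range(1, 1001)]
# A real injector would bulk-insert `rows` into the target in parallel.
```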

————————————————————————————————————————————–

REPLICATION & VIRTUALIZATION

Accelario provides a unique solution for securely synchronizing test environments from production, with an unlimited number of test environments to build:

  • Using our unique solution, a test copy of the production database (a Golden Image) is built, with data masked according to the organization's policies.
  • Once it is ready, the Accelario software can create a virtual copy (Image) of the database at the click of a button, in minutes and with almost no additional disk space or system resources.
  • The Golden Image is continuously synchronized with the source, allowing Images to be built for any given point in time.

With this unique solution, every developer and QA engineer enjoys a test environment that is synchronized with the production databases at all times and secured (masked), while saving time and storage space regardless of database size.

————————————————————————————————————————————–

AUTOMATION & DEVOPS PIPELINES

The Accelario solution provides an API for connecting to CI/CD processes and tools (Jenkins, Chef.io and others) to automate the data build and refresh processes.
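
For example, a CI stage could trigger an environment refresh over REST before the test suite runs. This is a hedged sketch: the endpoint URL, payload fields and token below are hypothetical, not Accelario's documented API.

```python
# Hypothetical sketch of triggering a test-data refresh from a CI/CD
# pipeline step (e.g. a Jenkins stage). The URL, payload fields and
# token are illustrative; consult the vendor's API documentation.
import requests

resp = requests.post(
    "https://tdm.example.com/api/refresh",          # hypothetical endpoint
    json={"environment": "qa-1", "masking": True},  # hypothetical payload
    headers={"Authorization": "Bearer <API_TOKEN>"},
    timeout=60,
)
resp.raise_for_status()
print("Refresh started:", resp.json())
```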


Working From Home challenges and best practices for DevOps & QA teams

In these hectic days of constant change and a worldwide Coronavirus situation, we need to adjust quickly to the challenges we face.

Working From Home (WFH) is now a fact of life and, as such, requires adjustments so that our software development, quality assurance and testing jobs can keep running.

One common solution is to build Dev/Test/QA environments quickly and securely on public clouds, allowing professional teams to continue working as if they were on their own organization's network, connected to their organization's data center.

Two main challenges have emerged, and both can be resolved with basic TDM (Test Data Management) practices: (1) masking and (2) sub-setting.

Challenge 1 – Protecting sensitive production data

Masking allows the organization to extract data from production into Dev/Test/QA environments without any danger of exposing sensitive data in less secure environments. On-the-fly masking performs the actual masking at the moment the data is "selected", so that data in transit is also masked and all data leaving the organization's network for the public cloud is protected. Masking methods vary according to the organization's choices – Substitution, Shuffling, Tokenization and more.

Challenge 2 – Avoiding massive storage, disk space and long data loads in the cloud

The ability to copy only the data needed for the Dev/Test/QA environment becomes critical when moving to the cloud – not only because of the cost of storage and disks, but also because of the time it takes to upload data to the cloud (sometimes 100x SLOWER than what we are used to inside our organization's network).

Sub-setting means selecting only the data needed for the current development task or a specific test; it is a common solution and also the easiest way to address this challenge. Choosing only the data needed – a range of dates, specific tables, a specific user, specific rows, etc. – allows a quick and easy build of the right dataset in the right environment.

 


What Are the Different Types of Database Migration?

 

If you work in the digital industry, you have probably heard of database migration. It is the process of moving data between two or more types of storage; the transfer can take place between different computer systems or formats. In other words, it is the process of moving data from one platform to another.

Data is moved from one platform to another for different reasons, such as cost savings, organization, and efficiency. The process involves different phases and iterations that must be assessed against business needs. Here is a detailed explanation of the different types of data migration and the reasons behind each type.

 


Reasons Why Data Migration Is Done

  • Cost-efficiency 

Financial savings are one of the top reasons for database migration. Typically, moving from an on-premises database to a cloud database cuts infrastructure and manpower expenses.

 

  • Modernization 

Database migration is often done to move from an outdated or legacy system to a modern one that meets the needs of modern data. The world is constantly evolving, and so is technology; hence the need for new storage methods.

 

  • One source of truth 

Data migration is also done to make all data available in one accessible source for all divisions of the organization. It usually takes place after an acquisition, specifically when systems need to be combined, or when different systems are siloed throughout the organization.


What is database migration in SQL Server?

Data is migrated in SQL Server for different reasons – moving data between SQL Server instances and restoring databases from a backup, to name a few. There are two ways of moving data in SQL Server: manually and via the command line. The method of choice depends on the task at hand. The manual method is fine if you are moving a few databases; if you are moving databases in bulk, the best method is a quick and secure automated process. The command line may require more preparation, but it is the preferable approach.
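
As a minimal illustration of the command-line route (a sketch that assumes sqlcmd is installed and the backup path is reachable from both servers; server names, credentials and paths are placeholders), a scripted backup-and-restore might look like this:

```python
# Hedged sketch: scripted SQL Server backup/restore via sqlcmd.
# Server names, credentials and file paths are placeholders.
import subprocess

def run_sql(server, query):
    subprocess.run(
        ["sqlcmd", "-S", server, "-U", "sa", "-P", "<password>", "-Q", query],
        check=True)

# Back up on the source, then restore on the target.
run_sql("source-host",
        r"BACKUP DATABASE [SalesDb] TO DISK = N'C:\backups\SalesDb.bak'")
run_sql("target-host",
        r"RESTORE DATABASE [SalesDb] FROM DISK = N'C:\backups\SalesDb.bak'")
```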

 

What are the different types of data migration?

Data migration can be done in many different ways. There are four primary types:

 

  1. Storage migration – As the name implies, the physical blocks of data are moved from one type of hardware to another.

  2. Database migration – The entire database is moved from one vendor to another, or the current database software is upgraded.

  3. Application migration – Transformation is a must when the application vendor changes, since every application operates on a particular data model.

  4. Business process migration – This pertains to a company's business practices, specifically business management tools that need to be replaced or upgraded. It usually happens in the event of a merger or acquisition, when data must be transferred from one business, database, or application to another.

 

What is database migration in MySQL?

MySQL databases are moved for varying reasons: transferring data to a testing server, or moving an entire database to a new production server. Migrating data in MySQL enables a robust database migration, and the process requires only a few simple steps – although the total duration depends on how much data needs to be migrated. Luckily, Accelario's MySQL data migration system keeps downtime minimal during the process.

 

There are also cases where you need to migrate data between two MySQL servers, such as separating databases for reporting, cloning a database for testing, or completely migrating a database to a new server. Generally, you back up the data on the first server, transfer it to the new destination server, and restore the backup on the new MySQL instance, as sketched below.
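
For a small database, this back-up/transfer/restore cycle can be scripted. Here is a hedged sketch (host names, users and the database name are placeholders, and mysqldump/mysql must be installed):

```python
# Hedged sketch of a MySQL dump-and-restore between two servers.
# Hosts, users, password and database name are placeholders.
import subprocess

# 1. Back up the database on the first server.
with open("shopdb.sql", "w") as dump:
    subprocess.run(
        ["mysqldump", "-h", "old-server", "-u", "app", "-p<password>",
         "--single-transaction", "--routines", "--triggers", "shopdb"],
        stdout=dump, check=True)

# 2. Restore the backup on the new server (after transferring the file).
with open("shopdb.sql") as dump:
    subprocess.run(
        ["mysql", "-h", "new-server", "-u", "app", "-p<password>", "shopdb"],
        stdin=dump, check=True)
```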


How to migrate a database to AWS? 

There are times when databases have to be moved to AWS (Amazon Web Services). Migrating to AWS is a sound decision, as AWS has an impressive portfolio: high performance, full manageability, and cost-effectiveness.

 

  • It improves performance at scale. AWS is designed for fast, interactive query performance at any scale, and many organizations that use AWS report 3 to 5 times better performance compared with other popular alternatives.

 

  • It makes the data fully manageable. By migrating a database to AWS, you break free from the complicated chores of database and data warehouse administration.

 

  • It is a cost-effective approach. It provides availability, security, and reliability. 

 

  • It is completely reliable and built to handle critical business workloads. It not only safeguards the data but also secures enterprise applications.

 

There is a growing demand for an effective, secure and time-saving process. With the AWS solution, it is easy to move and deploy in the cloud with scalability, high availability, and high performance.

 

5 Steps for AWS Database Migration

  1. The process starts with a replication instance.
  2. Connect the tool to the source and target databases.
  3. Choose the databases, tables, and/or schemas to migrate.
  4. Run the actual AWS data migration: allow AWS to load the data and keep it in sync.
  5. Once in sync, switch the applications over to the target (a boto3 sketch of these steps follows).
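
These steps map directly onto the AWS DMS API. Below is a compressed boto3 sketch; all identifiers, engines and connection details are placeholders, and real code would wait for each resource to become available before using it.

```python
# Hedged sketch of the five DMS steps with boto3. Identifiers and
# connection details are placeholders; error handling and the waits
# between steps are omitted for brevity.
import boto3, json

dms = boto3.client("dms", region_name="us-east-1")

# 1. Create a replication instance.
ri = dms.create_replication_instance(
    ReplicationInstanceIdentifier="tdm-demo",
    ReplicationInstanceClass="dms.t3.medium")

# 2. Connect to the source and target databases.
src = dms.create_endpoint(EndpointIdentifier="src", EndpointType="source",
                          EngineName="oracle", ServerName="on-prem-db",
                          Port=1521, Username="dms", Password="<pw>",
                          DatabaseName="ORCL")
tgt = dms.create_endpoint(EndpointIdentifier="tgt", EndpointType="target",
                          EngineName="postgres", ServerName="rds-host",
                          Port=5432, Username="dms", Password="<pw>",
                          DatabaseName="appdb")

# 3. Choose schemas and tables via table mappings.
mappings = {"rules": [{"rule-type": "selection", "rule-id": "1",
                       "rule-name": "all", "rule-action": "include",
                       "object-locator": {"schema-name": "APP",
                                          "table-name": "%"}}]}

# 4. Full load plus ongoing synchronization (CDC).
task = dms.create_replication_task(
    ReplicationTaskIdentifier="tdm-task",
    SourceEndpointArn=src["Endpoint"]["EndpointArn"],
    TargetEndpointArn=tgt["Endpoint"]["EndpointArn"],
    ReplicationInstanceArn=ri["ReplicationInstance"]["ReplicationInstanceArn"],
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps(mappings))

dms.start_replication_task(
    ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
    StartReplicationTaskType="start-replication")
# 5. Once the task reports the target in sync, switch applications over.
```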

 

Ease of use and the ability to secure the data are key criteria when choosing a data migration tool.

The majority of businesses with critical database workloads choose AWS migration.

 


About our data operation and migration

Accelario delivers an agile platform for establishing, refreshing and migrating databases.

Whether it's advanced analytics, application development, or cloud migration, being able to quickly populate a database is a challenge requiring a lot of data engineering effort, which limits the agility and effectiveness of these initiatives.

Using Accelario, data can be quickly migrated to new and refreshed databases without specialized data engineering skills. Data masking, sub-setting and related tasks can be managed through a simple user interface instead of requiring days of custom ETL programming, freeing developers, data scientists and IT personnel to focus on business challenges rather than data-related ones.

Key Use Cases

  • DevOps – allowing dev and QA teams to quickly create test data in self-service mode (Test Data Management).
  • DataOps – enabling data scientists and business analysts to set up modeling and back-testing data sets while ensuring data privacy and protection.
  • Cloud Migration – facilitating vendor-independent data migration to and from different cloud providers.

What is Database Migration?

Think of data migration like bird migration: it is driven by the need for resources. Birds migrate from areas with limited resources to areas abundant with food and ideal nesting conditions. How is that relevant to data? In the context of enterprise applications, companies move data from one platform to another to reach resources that are critical for business operations.

Out of the many reasons for database migration, the most common is financial: saving money with a cloud-based database. The need for new technical features also strengthens the operations necessary for competitive advantage. Simply put, outdated data platforms may not work with modern applications and marketing automation tools. We will go over the importance, processes, challenges, and solutions of data migration.

Why is database migration important?

On-premises vs. cloud-based databases

Think about your customers' behavior – browsing the web for the best deal, comparing reviews, or completing an online form. All of these behavior patterns and research methods are supported by the cloud. The best-known multinational tech companies, such as Amazon, Google and Apple, all run cloud-based data operations. Data operating in the cloud meets consumers' needs and habits with reliability, availability and security.

Cost optimization 

Moving from on-premises to a cloud database is cost-effective. Companies move to save money on infrastructure and on the skills (manpower) required for on-premises database management. Many businesses cite the cost savings of cloud environments as the main reason for cloud migrations.

Software development

Remember the modern consumer habits we discussed earlier? In this era of big data, new data storage techniques are key to success. For example, companies migrate from Microsoft SQL Server to cloud platforms as part of adapting to modern data needs. New storage techniques in the cloud provide solid ground along with operational flexibility. In addition, with cloud migration, companies can quickly deliver new and existing applications straight to customers' screens.

The core of operations

Moving data to a cloud platform provides access to all divisions of the company. Especially when multiple channels are combined in the business model, a one-stop data source is highly efficient for multi-channel operations. With strategic planning, data is delivered appropriately to the different teams: IT, marketing, or management. Optimized data flow from a single data source fuels the operations of each department while providing insights for management.

Scale-up

For multinational organizations and small startups alike, cloud data provides an opportunity for growth. Customers are used to quick access, and when a company has access to behavior analysis, all divisions can drive growth. Also, in many countries data protection is a nationally regulated procedure that must comply with local regulations. When using third-party data migration support, companies don't need to worry about regulations that are covered by the cloud platform.

What are the different types of data migration?

There are four main data migration types: storage, database, application and business process.

Storage migration

Exactly as it sounds – storage migration concerns the outdated physical hardware that stores the data. To enhance flexibility and save on operational costs, companies move data from a physical location to modern cloud systems.

Database migration 

Based on business needs, databases can be moved completely from one vendor to another. This also takes place with software upgrades, or when moving from one cloud platform to another.

Application Migration

Applications are migrated to modern environments, from an on-premises environment to a cloud application or from one cloud platform to another.

Business process migration

Business process migration usually takes place during management changes and acquisitions. During such processes, new tools are needed and data must be moved so that it is accessible to new divisions.

How does application data migration work? 

A clean data migration includes planning, file allocation, migration, and testing phases.

Data profiling 

The initial stage is data profiling and resource assessment. During data profiling, companies map out data locations and schemas. After finding the data in its existing location (databases or files), the data should be organized into informative, optimized patterns. Planning also takes into account the costs and requirements involved. Integrating data into the new platform may involve risks, which are addressed with secure transformation methods. A small profiling sketch follows.
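
As a small, hedged illustration of the profiling step (connection details and the schema name are placeholders), a first pass can be as simple as querying the database catalog for table sizes:

```python
# Hedged sketch: first-pass data profiling via the MySQL catalog.
# Connection details and the schema name are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="prod-db", user="profiler", password="<pw>")
cur = conn.cursor()
cur.execute("""
    SELECT table_name, table_rows,
           ROUND(data_length / 1024 / 1024, 1) AS data_mb
    FROM information_schema.tables
    WHERE table_schema = 'appdb'
    ORDER BY data_length DESC
""")
for name, rows, mb in cur.fetchall():
    print(f"{name}: ~{rows} rows, {mb} MB")  # feeds the migration plan
conn.close()
```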

Files migration

Data migration applies to all of your environments. Whether you are migrating small or large batches of files, the process should be simple, with minimal cost and minimal downtime affecting business operations.

Schema conversion  

A database schema is the blueprint of how data is organized – the database hierarchy and the relationships between different data groups. When moving data from one source to another, you need to make sure the new platform is compatible with and can support the old blueprint.

Data migration operational suite 

Once you have ensured that the data schema is supported, you can start the actual migration. Without expertise, this process can be complicated. Luckily, data migration and operations can be handled by fast, self-service applications, creating an opportunity to quickly migrate and deliver data to new or refreshed databases with minimal downtime and without specialized engineering skills.

Testing 

Once your DevOps or QA team has moved the data, testing is necessary to ensure that no valuable information is missing. During the testing phase, you can set up modeling and back-testing data sets while ensuring data privacy and protection. Testing data transformation can be a lengthy process, especially because data is masked and encrypted during migration to avoid breaches of sensitive customer data. A new generation of automated tools brings a fresh perspective to data masking and testing, with data refresh and duplication.

3 Database Migration Challenges 

Most digital companies rely on information systems to support business processes, and secure data migration is a critical element when a company decides to improve operational flow with a new data platform. Moving data from one storage system to an online destination without affecting operations sounds easy, but without proper technical support this process can crush a company of any size. Proper planning is the answer to all data migration challenges.

Database allocation

For many companies, multinational operations are common: your company's headquarters may be in the US while your IT and customer support are in another state or country. Mapping out where operational data is stored is the first (and key) step to a successful data migration. At a larger scale, when acquisitions take place, companies need to define a data migration schema spanning various geographic areas.

Data loss and corruption 

Fast data migration is crucial, but speed must be backed by advanced security features such as data masking and subsetting. Proven third-party data migration software is crucial for a corruption-free process with no loss of valuable data.

Secure data protection 

It's no secret that there is an abundance of trolls who would love to get their hands on your data for a decent reward. When moving data from one source to another, it's critical to secure it: you can encrypt or mask all personal and financial information during data migration.

How can Accelario help migrate a database? 

When transferring data, a quick and cost-effective solution can simplify migration project delivery without specialized, costly expertise. Accelario offers a data migration and operations suite designed to safely deliver data with minimal cost and downtime. First of all, Accelario's automated solution is easy to use and involves no coding, which reduces operational cost and time: anyone can migrate, analyze and test with secure cloud adoption.

Accelario's solution simplifies, accelerates and secures data migration. Our solution has been successfully implemented by clients from various industries, including banking, TV, and software. We currently support on-premises data migration to cloud platforms, focusing on Oracle and SQL data operations and migration on the Amazon Web Services (AWS) Marketplace.

Microsoft SQL Migration 

Data migration can truly benefit your customers with a fast platform to support their shopping and service needs. Accelario's automated SQL migration tool has been proven to dramatically reduce costs and risks while keeping downtime minimal and without sacrificing performance. We assist with highly secure migration, rapidly copying or refreshing data between production and test environments.

Transferring SQL databases to AWS has proven to deliver high-performance, reliable, cost-effective cloud computing. Amazon EC2 running Microsoft Windows Server is a fast and dependable environment for deploying applications using the Microsoft Web Platform. We can help you with application hosting, website and web-service hosting, data processing, media transcoding, and distributed testing during your data migration.

Oracle Data Operation and Migration 

At Accelario, we can help with PaaS and DBaaS utilization. Our rapid copy-and-refresh tool for moving data between production and test environments is highly beneficial for your DevOps and QA professionals. Cross-platform features support cloud and on-premises operation in any direction with minimal downtime. The solution is faster and more secure than any alternative thanks to privacy features such as data masking and subsetting. Accelario's Data Operation Suite saves up to 35% of data migration cost and cuts data migration time by 80%.

Now you have learned all about migrating your stored databases to their cloud destination. Microsoft SQL Server and Oracle databases feature different configurations; Accelario supports both with fast, secure and easy-to-use data migration solutions. It's time to scale your business with cloud data operations.

 


Our zero-downtime database migration with AWS DMS

Zero-downtime database migration with AWS Database Migration Service and Accelario

AWS BLOG

This is a guest post by Michael Litner, Co-founder and VP R&D of Accelario.

Accelario is database migration software that provides a fast and easy way to load an Oracle database into Amazon Web Services (AWS). At the end of the initial load, database synchronization starts immediately using AWS Database Migration Service (AWS DMS), and the result is a zero-downtime migration of your database.

Businesses that require 24/7 operation face a significant issue when moving databases into the cloud: Up until now, few if any cost-effective options existed to minimize the resulting downtime. With its recent integration with AWS DMS, Accelario provides a ready-to-use, zero downtime, full database migration solution—meaning that the whole database is migrated, including users, procedures, views, etc. When the process is done, you can access your database and start using it with your application immediately.

Another significant issue affecting such businesses is ensuring that sensitive information is not exposed as part of the process, which is critical for complying with data protection policies and regulations. You can achieve encryption of data in transit by using Oracle’s built-in features (including network data encryption)—also supported in Amazon Relational Database Service (Amazon RDS).

In this post, I go over how to use this combined solution to migrate a database to either Amazon EC2 or Amazon RDS (also using data masking). I also show you how to easily refresh a database after it’s in the cloud.

 

READ the full article –  AWS blog


 

WATCH – Cloud migration from on-premises to AWS

With Accelario, IT managers can now plan their migration in only a few minutes, with maximum control over the entire process.

