
Friday, 10 May 2019

What is Big Data Testing? Big Data Testing Tools and Types | W3Softech

Big Data Testing:

Big Data Testing is the process of testing applications that handle Big Data, that is, collections of data sets too large to be handled by traditional data computing applications. Testing these datasets involves a wide range of tools, techniques, and frameworks. Performance Testing and Functional Testing are the key elements of Big Data Testing.
In this process, testers need to verify the processing of terabytes of data using supporting components. It involves checking various data characteristics such as accuracy, conformity, consistency, completeness, duplication, and validity.
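
As a rough illustration of those characteristic checks, here is a minimal Python sketch of record-level validation; the field names ("id", "email") and rules are hypothetical examples, not part of any specific tool.

# A minimal sketch of record-level data quality checks.
# Field names and validity rules here are hypothetical examples.
import re

def validate_records(records):
    """Return counts of completeness, duplication, and validity issues."""
    issues = {"incomplete": 0, "duplicate": 0, "invalid": 0}
    seen_ids = set()
    for rec in records:
        # Completeness: every expected field must be present and non-empty
        if not rec.get("id") or not rec.get("email"):
            issues["incomplete"] += 1
        # Duplication: the same id must not appear twice
        if rec.get("id") in seen_ids:
            issues["duplicate"] += 1
        seen_ids.add(rec.get("id"))
        # Validity: values must match the expected format
        if rec.get("email") and not re.match(r"[^@]+@[^@]+\.[^@]+$", rec["email"]):
            issues["invalid"] += 1
    return issues

sample = [
    {"id": "1", "email": "a@example.com"},
    {"id": "1", "email": "not-an-email"},   # duplicate id, invalid email
    {"id": "", "email": "b@example.com"},   # incomplete record
]
print(validate_records(sample))  # {'incomplete': 1, 'duplicate': 1, 'invalid': 1}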

Big Data Testing is divided into three steps:

Step 1: Data Staging Validation

  • In the first step, a large amount of data from a wide range of sources such as RDBMS, social media, and weblogs is validated to ensure that it is pulled into the system correctly
  • It compares the data pushed into Hadoop with the source data to ensure that both match (a minimal sketch of this comparison follows the list)
  • It verifies that the extracted data is pushed into the correct HDFS location
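
A common staging check is to compare record counts between the source and what landed in HDFS. The Python sketch below uses the standard 'hdfs dfs -cat' CLI for the HDFS side and a generic DB-API connection for the source side; for terabyte-scale data a Hive or Spark COUNT(*) would be more realistic than streaming the file.

import subprocess

def count_hdfs_records(hdfs_path):
    """Count records in an HDFS file via the hdfs CLI (assumes one record per line)."""
    out = subprocess.run(["hdfs", "dfs", "-cat", hdfs_path],
                         capture_output=True, check=True)
    return len(out.stdout.splitlines())

def count_source_records(connection, table):
    """Source-side count via a DB-API connection to the RDBMS."""
    cur = connection.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")  # table name is trusted test input
    return cur.fetchone()[0]

def check_staging(connection, table, hdfs_path):
    """Fail the staging validation if source and HDFS record counts differ."""
    src = count_source_records(connection, table)
    dst = count_hdfs_records(hdfs_path)
    assert src == dst, f"Staging mismatch: source={src}, HDFS={dst}"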

Step 2: MapReduce Validation

In the second step, QA engineers or testers verify the business logic on every single node, and then validate it again after running over multiple nodes. MapReduce validation is based on the Map procedure, which performs filtering and sorting, while the Reduce procedure performs a summary operation (a minimal sketch of both phases follows the list)
  • It ensures that the application's MapReduce process works properly
  • It checks that the data aggregation rules are implemented on the data
  • It validates the data after the MapReduce process completes
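
To make the Map and Reduce split concrete, here is a minimal word-count-style aggregation in plain Python. This is only a local illustration of the filtering/sorting and summarizing phases; on a real cluster the same logic would run via Hadoop, and the tester compares the single-node result against the multi-node result.

from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Map: emit (key, 1) pairs for each word, the filtering step."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce: sort by key (the shuffle), then summarize each group."""
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (key, sum(count for _, count in group))

data = ["big data testing", "testing big data applications"]
result = dict(reduce_phase(map_phase(data)))
# Validation: the aggregated output must match the independently computed expectation
assert result == {"big": 2, "data": 2, "testing": 2, "applications": 1}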

Step 3: Output Validation Phase

The third step in Big Data Testing is the output validation phase. In this final step, the output files are generated and moved to a Data Warehouse or to any other system, depending on the requirements
  • It checks whether the transformation rules are applied correctly
  • It validates the data integrity and the data load into the target system
  • It ensures the data is free from corruption by comparing the HDFS data with the target system data (a checksum-based sketch follows the list)
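
One way to detect corruption during the final move is to compare checksums of the exported HDFS output with the file loaded into the target system. The sketch below assumes both files have already been pulled to the local test machine (for example via 'hdfs dfs -get'); the paths and helper names are illustrative.

import hashlib

def file_checksum(path, chunk_size=1 << 20):
    """MD5 of a local file, read in chunks so large files don't exhaust memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def check_output(hdfs_export_path, warehouse_path):
    """Fail the output validation if the two copies differ."""
    src = file_checksum(hdfs_export_path)
    dst = file_checksum(warehouse_path)
    assert src == dst, f"Corruption detected: {src} != {dst}"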

Difference between Traditional Database Testing and Big Data Testing:
Properties       | Traditional Database Testing                                          | Big Data Testing
Data             | The tester works with structured data                                 | The tester works with both structured and unstructured data
Approach         | The testing approach is well defined and time-tested                  | The testing approach requires focused R&D efforts
Infrastructure   | No special test environment is needed, as the system size is limited | A special test environment is required, as datasets run to terabytes
Validation Tools | Testers use macros or automation tools for system validation         | Tools vary with the type of big data cluster

Different Types of Big Data Testing Tools:

Big Data Cluster | Big Data Testing Tools
MapReduce        | Cascading, Flume, Hadoop, Hive, Kafka, MapR, Oozie, Pig, S4
NoSQL            | Cassandra, CouchDB, HBase, MongoDB, Redis, ZooKeeper
Processing       | BigSheets, Datameer, Mechanical Turk, R, Yahoo! Pipes
Servers          | EC2, Elastic, Google App Engine, Heroku
Storage          | Hadoop Distributed File System (HDFS), S3