We would like to welcome Mohan Sadasivam as our esteemed speaker for the Global Testing Retreat #ATAGTR2019.
Mohan Sadasivam holds a B.E. (ECE) and an M.Tech in Information and Communication Technology from Anna University, Chennai, and has 7 years of experience in the IT industry.
He is a Technical Lead (Offshore) at TCS Chennai, and technological innovations in data analytics and its infrastructure are his inspiration.
He is passionate about business optimization through functional and performance testing on big data frameworks like Hadoop.
He has published more than 10 research papers, mainly focused on cloud computing and big data.
Mohan will be conducting a hands-on lab session on “Benchmarking Hadoop by its Functional/Performance Testing and Leveraging to Current Business Expectations” – Track 4 on Day 1.
The key feature of big data is the tremendous growth (volume) of data arriving at high speed (velocity), particularly unstructured data. The performance of data processing and computation is inversely proportional to the big data characteristics (volume, velocity and variety). Though Hadoop is a powerful framework for dealing with big data, many Business Intelligence (BI) and analytics departments today face challenges with Hadoop ecosystems because of their storage and processing architecture.
Because Hadoop is a cluster built on parallel computing and distributed storage/computation, its performance depends on many factors such as networks, storage disks, memory and CPU. To get more value from Hadoop in data warehouse and data analytics use cases, many users are modernizing certain aspects of their implementation. For example, to lower the cost and simplify the maintenance of Hadoop clusters, users are migrating them from on-premises infrastructure to cloud environments. Similarly, in an EMC Isilon Hadoop deployment, HDFS is integrated as a protocol into the Isilon distributed OneFS operating system. This approach gives users direct access through HDFS to data stored on the Isilon cluster using standard protocols such as SMB, NFS, HTTP and FTP.
In this presentation, (a) we cover different testing and benchmarking methodologies to ensure quality in terms of data storage and processing in a Hadoop environment; (b) given that Hadoop's native execution engine (MapReduce) suffers from processing latency, we discuss how to leverage Hadoop in current data technology; and (c) we show how to leverage and integrate Hadoop with advanced data science frameworks for better outcomes in predictive analytics.
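To give a flavour of the functional-testing side of point (a), here is a toy, in-process sketch in plain Python (this is not the Hadoop API, and the sample data is made up): run a MapReduce-style word count against a known input, assert the expected output, and time the run as a crude performance probe.

```python
import time
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Reducer: sum the counts per word
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Functional test: known input, asserted output
sample = ["big data", "big clusters"]
start = time.perf_counter()
result = reduce_phase(map_phase(sample))
elapsed = time.perf_counter() - start

assert result == {"big": 2, "data": 1, "clusters": 1}
print(f"word count correct; ran in {elapsed:.6f}s")
```

Against a real cluster, the same idea scales up to Hadoop's own benchmark jobs, with the expected-output assertion replaced by checks on job counters and output files.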
We posed some questions to Mohan as part of his #ATAMyStory:
1. Why should someone choose testing as a career? Or What does testing really mean to you?
One can become a developer by learning one language, but ensuring quality against customer requirements and specifications is challenging.
No software or service exists without mistakes, and those mistakes are bugs! The elimination of bugs from a software framework depends upon the efficiency of testing.
Software testing and quality assurance is nothing but a well-planned, systematic approach to evaluating quality. Beyond the standard definitions, lifecycles, procedures and tools of testing, a tester should build an internal spark for testing to make a mark in a testing career.
2. While practice makes all of us perfect, share an everyday practice that has made things better for you at work
1. While starting my day, I update myself by reading all my mails.
2. I keep my product knowledge up to date by reading the vendors' product release documents.
3. I note down problems and their solutions so I can avoid them in the future.
3. The most challenging bug or issue that you have found and your learnings from it? Or the most challenging testing task that you have done and how you accomplished it
1. While doing Kafka rebalancing (after a broker was decommissioned or commissioned), I faced an issue with uneven rebalancing, since Kafka is still not perfect at rebalancing. Manual intervention was needed to achieve an even data distribution across all the disks.
2. I had to test and monitor the skew factor in Teradata to get better performance in development.
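For context on that skew factor: in Teradata it is commonly derived from per-AMP permanent space (e.g. queried from DBC.TableSize) as 100 minus the average-to-maximum ratio, so 0 means rows are spread perfectly evenly. A small sketch in plain Python, with made-up per-AMP figures:

```python
def skew_factor(per_amp_bytes):
    # Skew factor as commonly computed from DBC.TableSize:
    # 100 - (average perm space / maximum perm space) * 100.
    # 0 means the table's rows are spread perfectly evenly.
    avg = sum(per_amp_bytes) / len(per_amp_bytes)
    return 100 - (avg / max(per_amp_bytes)) * 100

# Hypothetical per-AMP permanent space for one table (bytes)
even = [1000, 1000, 1000, 1000]
skewed = [4000, 500, 500, 500]   # one "hot" AMP holds most rows

print(skew_factor(even))    # 0.0
print(skew_factor(skewed))  # 65.625
```

A high skew factor means one AMP does most of the work while the rest sit idle, which is why monitoring it pays off in development.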
4. Anything else you want to share
MapReduce is inefficient at huge-scale data processing in Hadoop, so I test and analyze what modifications can be made to avoid multiple I/O operations in MapReduce programming methodologies.
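One standard mitigation for that intermediate I/O, sketched here in plain Python rather than Hadoop's Java API, is a map-side combiner: pre-aggregating mapper output locally so far fewer intermediate records are spilled to disk and shuffled to the reducers.

```python
from collections import Counter

def mapper(lines):
    # Emit one (word, 1) record per word, as a mapper would
    for line in lines:
        for word in line.split():
            yield (word, 1)

def combine(pairs):
    # Map-side combiner: aggregate locally before the shuffle
    return list(Counter(word for word, _ in pairs).items())

lines = ["to be or not to be"] * 3
raw = list(mapper(lines))   # 18 intermediate records
combined = combine(raw)     # 4 records: one per distinct word

# The final counts are unchanged; only the shuffled volume shrinks
assert dict(combined) == {"to": 6, "be": 6, "or": 3, "not": 3}
print(len(raw), "records shuffled without a combiner")
print(len(combined), "records shuffled with a combiner")
```

The same principle is why execution engines that keep intermediate results in memory instead of writing them between stages can outperform vanilla MapReduce.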
#ATAGTR2019 is one of the largest, most fun-filled global testing conferences, packed with learning. It is back in its 4th edition with more fun and more learning than ever before. The conference is scheduled for 14th and 15th December in Pune.
- 2 Days
- 2 Panel Discussions
- 4 Keynotes
- 5 Skits and Games
- 38 Interactive Sessions
- 18+ Labs
Loads of fun, competitions and much, much more.
Day 1 has 5 tracks and Day 2 has 6 tracks.
The focus is on interactive sessions, labs/workshops, skit performances, fun activities, quizzes and much, much more.
As is evident, the scale of the conference has increased substantially. We hope everyone can be part of this fun-filled event, one of the largest global testing conferences.
To know more about the conference, click here or on the image below