I am a software QA engineer with over six years of experience in the software industry. I hold a Master's degree in Computer Science from The University of Iowa and have extensive experience in both manual and automated testing.
My expertise in automation testing tools such as Selenium, Cypress, Playwright, and JMeter has helped me develop and maintain a robust suite of automated test cases that ensure the quality and reliability of our products. Additionally, I have experience with performance testing, load testing, and stress testing using JMeter and other tools.
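A core pattern in the automated suites described above is the explicit wait: polling for a page state instead of sleeping a fixed interval. As a minimal illustration, here is that pattern in plain Python (with a hypothetical `FakePage` stand-in so it runs standalone, without a browser driver); Selenium's `WebDriverWait` and Playwright's auto-waiting provide the same behavior for real pages.

```python
import time

def wait_until(condition, timeout=5.0, interval=0.1):
    """Poll `condition` until it returns a truthy value or the timeout expires.

    Mirrors the explicit-wait pattern used in UI automation: check for a
    state repeatedly rather than sleeping for a fixed duration.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError(f"condition not met within {timeout} seconds")

# Hypothetical "page" whose banner element appears after a short delay.
class FakePage:
    def __init__(self):
        self._ready_at = time.monotonic() + 0.3

    def find_banner(self):
        return "Welcome" if time.monotonic() >= self._ready_at else None

page = FakePage()
banner = wait_until(page.find_banner, timeout=2.0)
```

The same helper shape covers most flaky-test fixes: the test asserts on the value the wait returns, and a timeout surfaces as a clear failure instead of a race.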
I am well-versed in agile software development methodologies and have worked closely with development teams to ensure our testing efforts align with project timelines and goals.
Moreover, my knowledge of SQL and database technologies enables me to validate the integrity of data stored in our products effectively. I am also familiar with data extraction, web scraping, data wrangling, and data acquisition techniques that enable me to collect, transform, and publish data accurately and efficiently.
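Data-integrity validation of this kind typically boils down to a handful of SQL checks. A minimal sketch using Python's built-in `sqlite3` and a hypothetical two-table schema (the table and column names are illustrative, not from any real product):

```python
import sqlite3

# Hypothetical schema used only for illustration; real checks run the same
# style of queries against the product database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT NOT NULL);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'a@example.com'), (2, 'b@example.com');
    INSERT INTO orders VALUES (10, 1, 25.00), (11, 2, 40.00), (12, 99, 5.00);
""")

# Referential-integrity check: orders that point at a missing customer.
orphans = conn.execute("""
    SELECT o.id FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL
""").fetchall()

# Uniqueness check: duplicate emails would indicate a data-quality defect.
dupes = conn.execute("""
    SELECT email, COUNT(*) FROM customers
    GROUP BY email HAVING COUNT(*) > 1
""").fetchall()
```

Here the orphan query flags order 12 (customer 99 does not exist), while the duplicate check comes back empty; in a test suite each result feeds an assertion.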
Beyond my technical skills, I am an avid learner who constantly seeks new opportunities to improve my knowledge and skills. I have attended several training sessions, workshops, and industry conferences that have helped me stay current with the latest developments in the software industry.
Outside of work, I am passionate about staying up-to-date with the latest trends and developments, and I enjoy participating in online communities and events to learn from and connect with other professionals.
- Programming and Scripting Languages: JavaScript, Python, R, SQL, Bash, Java, HTML, CSS
- Automation Testing: Selenium, SeleniumBase, Cypress, JMeter, and Playwright
- Working with Different Data Types: JSON, CSV, Excel, Text, XML, SQL, Parquet, Avro, ORC
- Version Control, Container Virtualization: Docker
- Databases: Postgres, SQL Server, MySQL, SQLite, and MongoDB
- ETL and Database Model Development: Implement New Procedures and Build Data Warehouses
- Data Warehouse, Data Lakes, Data Pipelines, Automation
- Gather Requirements from Business Analysts
- Develop Physical Data Models Using Erwin
- Create DDL Scripts to Design Database Schema and Database Objects
- Cloud Computing: AWS, Microsoft Azure (AZ-900, DP-900, AI-900)
- Perform Database Defragmentation and Optimize SQL Queries
- Improving Database Performance and Loading Speed
- Frameworks: PySpark and Hadoop
- Data Visualization: Tableau
- Operating Systems: Windows, macOS, Linux (Ubuntu, Red Hat)
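Working across the data formats listed above often means converting between them while fixing up types along the way. A small standard-library sketch of CSV-to-JSON conversion (the sample rows are invented for illustration):

```python
import csv
import io
import json

# Hypothetical sample data; real pipelines read from files, databases, or APIs.
csv_text = "id,name,score\n1,Ada,91\n2,Grace,88\n"

rows = list(csv.DictReader(io.StringIO(csv_text)))

# csv yields every field as a string, so numeric columns must be cast
# explicitly before serializing.
for row in rows:
    row["id"] = int(row["id"])
    row["score"] = int(row["score"])

json_text = json.dumps(rows, indent=2)
```

The same shape (parse, normalize types, serialize) carries over to Parquet, Avro, and ORC with libraries such as PyArrow in place of the standard-library modules.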
🔹Fun fact 👉 01000011 01101111 01100100 01101001 01101110 01100111 00100000