Category: Spark
-
Covid-19 Analysis with BigData Applications – Part 3
Hi there! If you’ve been following this blog series, you know we are looking at a BigData project to analyze Covid-19 data. So far, we have looked at the overall architecture and the ETL Spark jobs. In this post, let’s look at the scheduler component (Lambda function) in this workflow. The main reason I’m using Lambda here is due…
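The post itself carries the full scheduler details; as a rough sketch of the idea (the cluster ID, bucket, and script path below are placeholders, not taken from the series), a Lambda handler can submit a Spark step to a running EMR cluster with boto3:

```python
import boto3

# Minimal sketch of a Lambda-based scheduler: submit a Spark step to an
# existing EMR cluster. Cluster ID and S3 paths are illustrative placeholders.
emr = boto3.client("emr")

def lambda_handler(event, context):
    response = emr.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXXX",  # placeholder EMR cluster ID
        Steps=[{
            "Name": "covid19-etl",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "s3://my-bucket/jobs/etl_job.py"],
            },
        }],
    )
    # Return the submitted step IDs so the invocation result is easy to trace
    return {"step_ids": response["StepIds"]}
```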
-
Covid-19 Analysis with BigData Applications – Part 2
Hi again! In this post, I’ll explain the remaining two ETL jobs: the first one processes the Twitter data related to Covid-19, and the second one combines the data from the previous two ETL jobs. As we have already covered the basic EMR concepts earlier, I’ll get straight into the explanation of what is being…
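The excerpt does not show the actual schema, so purely as a sketch (paths, column names, and the join key below are assumptions), the "combine" job could look something like this in PySpark:

```python
from pyspark.sql import SparkSession

# Minimal sketch of the combine job: join the Covid-19 case data with the
# processed Twitter data on a shared date column. Paths and columns are
# illustrative, not the actual schema used in the series.
spark = SparkSession.builder.appName("covid19-combine").getOrCreate()

cases = spark.read.parquet("s3://my-bucket/output/cases/")
tweets = spark.read.parquet("s3://my-bucket/output/tweets/")

combined = cases.join(tweets, on="date", how="inner")
combined.write.mode("overwrite").parquet("s3://my-bucket/output/combined/")
```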
-
Covid-19 Analysis with BigData Applications – Part 1
Hi again! So, if you came here after reading the introduction post, we’ll be talking about the part that we run on the EMR cluster, i.e. the ETL job. Across the BigData community, the term ETL generally refers to Extract, Transform and Load. And for this project, I’m using Apache Spark, which happens to be one of the…
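To give a feel for the extract-transform-load pattern mentioned above, here is a minimal PySpark sketch (the input path and column names are placeholders; the real job in the post defines its own schema):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal Extract-Transform-Load sketch in PySpark with placeholder paths/columns.
spark = SparkSession.builder.appName("covid19-etl").getOrCreate()

# Extract: read raw Covid-19 case data from S3
raw = spark.read.option("header", True).csv("s3://my-bucket/raw/covid19.csv")

# Transform: type the date column and aggregate daily case counts per country
daily = (raw
         .withColumn("date", F.to_date("date"))
         .groupBy("country", "date")
         .agg(F.sum(F.col("cases").cast("long")).alias("total_cases")))

# Load: write the result back to S3 as Parquet for downstream jobs
daily.write.mode("overwrite").parquet("s3://my-bucket/output/cases/")
```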
-
Covid-19 Analysis with BigData Applications – Introduction
Hi there! As we all know, the Covid-19 pandemic has been the prime highlight of this year. It has been directly affecting our way of living and has also made some serious dents in global socio-economic dynamics. And there are lots of people who have been continually working to understand, fight, and control this epidemic. So,…