
How Big Data is Factoring Into the Upcoming Solar Eclipse

Monday, August 21 is going to be a special day in American history. For the first time in nearly 40 years, a total solar eclipse will be visible from the contiguous United States. The moon will pass between the sun and the Earth, blocking part or all of the sun for up to about three hours at any given location.

Since this event is such a rarity, there’s a lot of hype around it. People are traveling to areas that will see a total eclipse. Special viewing glasses are flying off the shelves. Even Bonnie Tyler, who recorded the smash hit “Total Eclipse of the Heart,” will perform a rendition of her famous song for guests on the Royal Caribbean Total Eclipse Cruise.

Though the path of totality – where the sun is completely blocked out by the moon – is only about 70 miles wide, the entire country will experience at least a partial eclipse. Of course, the timing and the degree of coverage vary from place to place. To figure out the prime time to see the eclipse in your area, we need to call on big data.

Fortunately, the United States Naval Observatory and NASA have mapped the trajectory of the moon's shadow across the Earth and calculated the maximum obscuration – the percentage of the sun's disk the moon will cover – for any given location. The elevation of the ground, the irregular, mountainous edge of the moon, and the angle of the sun all play a part in determining the precise shape of the path of totality.
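
To make the idea of maximum obscuration concrete, here is a minimal sketch (not the Naval Observatory's or NASA's actual code) of how the covered fraction of the sun's disk could be estimated from the apparent angular radii of the sun and moon and the angular separation of their centers, using standard circle-overlap geometry. The numbers in the example are illustrative, not real ephemeris data.

```python
import math

def obscured_fraction(sun_radius, moon_radius, separation):
    """Fraction of the sun's disk covered by the moon.

    All arguments are apparent angular sizes in the same units
    (e.g. degrees): the sun's disk radius, the moon's disk radius,
    and the angular distance between the two centers.
    """
    R, r, d = sun_radius, moon_radius, separation

    if d >= R + r:          # disks don't overlap at all: no eclipse
        return 0.0
    if d <= abs(R - r):     # one disk sits entirely inside the other
        return min(1.0, (r * r) / (R * R))

    # Standard area of intersection of two circles (a lens-shaped region).
    a1 = r * r * math.acos((d * d + r * r - R * R) / (2 * d * r))
    a2 = R * R * math.acos((d * d + R * R - r * r) / (2 * d * R))
    a3 = 0.5 * math.sqrt((-d + r + R) * (d + r - R) * (d - r + R) * (d + r + R))
    overlap = a1 + a2 - a3

    return overlap / (math.pi * R * R)

# Illustrative values only: sun and moon both appear roughly 0.27 degrees
# in radius, with their centers about 0.12 degrees apart at mid-eclipse.
# Prints roughly 0.73, i.e. a deep partial eclipse.
print(round(obscured_fraction(0.267, 0.270, 0.12), 3))
```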

As a result, anyone in the U.S. (including Alaska and Hawaii) can type in their zip code and see how much of the sun will be obscured. The interactive map also shows the closest location for viewing the total eclipse. If you're serious about seeing the sun blocked out entirely, a road trip may be in order.

In Jersey City, for example, the eclipse will peak just before 2:45 pm EDT. The moon will obscure 71.6 percent of the sun. If we wanted to see the total eclipse, we’d need to travel 572 miles southwest to Charleston, South Carolina.
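
That mileage can be sanity-checked with a quick back-of-the-envelope calculation. Below is a minimal sketch using the haversine (great-circle) formula with rough coordinates for Jersey City and Charleston. It returns an as-the-crow-flies figure of roughly 640 miles; the interactive map's 572-mile number will differ, since it depends on exactly which point on the path of totality is measured and how the trip is routed.

```python
import math

def great_circle_miles(lat1, lon1, lat2, lon2):
    """Straight-line (great-circle) distance between two points, in miles."""
    earth_radius_miles = 3958.8
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * earth_radius_miles * math.asin(math.sqrt(a))

# Rough, illustrative coordinates (degrees latitude, longitude).
jersey_city = (40.73, -74.07)
charleston_sc = (32.78, -79.93)

# Prints roughly 640 miles as the crow flies.
print(round(great_circle_miles(*jersey_city, *charleston_sc)))
```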

Check out the video below to see how NASA data visualizer Ernie Wright developed the path of totality using several tools and a variety of factors. In fact, NASA and its Jet Propulsion Laboratory (JPL) have been turning to the cloud for the big data and IT infrastructure needed to do these types of analyses.

NASA JPL used the AWS cloud to process and share images from Mars, streaming 150 TB of data in just a few hours and handling over 80,000 requests per second. JPL has also used AWS to run big data analyses for course corrections on missions, far beyond what it could have done with traditional IT infrastructure. The cloud helps agencies like NASA rethink how they approach big data questions in space exploration.
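
JPL's actual pipeline isn't spelled out here, but the general pattern of pushing large image files into cloud object storage so they can be processed and served at scale looks roughly like the hypothetical sketch below, written with the AWS SDK for Python (boto3). The bucket and file names are made up for illustration.

```python
import boto3

# Hypothetical names for illustration only; this is not JPL's pipeline,
# just the generic "upload imagery to S3 so it can be shared at scale" pattern.
BUCKET = "example-mission-imagery"

s3 = boto3.client("s3")

def publish_image(local_path, key):
    """Upload one image file to S3 so downstream consumers can fetch it."""
    s3.upload_file(
        Filename=local_path,
        Bucket=BUCKET,
        Key=key,
        ExtraArgs={"ContentType": "image/png"},
    )
    return f"s3://{BUCKET}/{key}"

if __name__ == "__main__":
    print(publish_image("frame_0001.png", "raw/frame_0001.png"))
```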

And remember, if you’re planning to view the eclipse, protect your eyes with special viewing glasses.

About David Lucky

As Datapipe’s Director of Product Management, David has unique insight into the latest product developments for private, public, and hybrid cloud platforms and a keen understanding of industry trends and their impact on business development. David writes about a wide variety of topics including security and compliance, AWS, Microsoft, and business strategy.
