Big Data in Space

The article “Space: The Big Data Frontier” examines the implications that big data analysis will have for the progress of astronomical discovery and exploration. Specifically, the article discusses the issues that researchers will face should the funding for the Large Synoptic Survey Telescope (LSST) project be renewed. As budget renewal does seem likely, scientists must finalize and implement big data solutions soon.


The data problems facing the LSST team are massive. While Kirk Borne, the Chair of Information and Statistics for LSST, makes clear that the technicalities of data storage are not an issue, the team’s real challenge will be finding meaningful information in all of the data that the LSST will collect; it’s estimated that the LSST will gather “around ten petabytes [approximately ten million gigabytes] of data per year”.


Author Mari Silbey also addresses another issue that the LSST team will face: bandwidth. While the telescope will be located in Chile, its data will be transferred to Illinois each day, and “[while] sufficient bandwidth for moving big data around may be available in a few select geographic regions, it certainly doesn’t exist everywhere researchers reside.” However, public and private institutions are working to resolve the bandwidth problem, and researchers are also working to improve algorithmic analysis in order to streamline the data to be transferred.
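To get a sense of why bandwidth is such a concern, here’s a quick back-of-envelope sketch I’ve put together; the only figure taken from the article is the ~ten-petabytes-per-year estimate, and the decimal unit convention is my assumption:

```python
# Back-of-envelope: sustained bandwidth needed to move LSST's
# estimated ~10 PB/year of data from Chile to Illinois.
PETABYTE = 10**15                        # bytes (decimal convention)
yearly_bytes = 10 * PETABYTE
daily_bytes = yearly_bytes / 365         # volume per daily transfer
sustained_bps = daily_bytes * 8 / 86_400 # bits/sec if spread over a full day
print(f"~{daily_bytes / 10**12:.1f} TB per day")
print(f"~{sustained_bps / 10**9:.2f} Gbit/s sustained")
```

Even spread evenly over 24 hours, that daily transfer needs multiple gigabits per second of sustained throughput, which helps explain why this kind of link “certainly doesn’t exist everywhere researchers reside.”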


Though I’m not a formal student of astronomy, I’ve always found the topic interesting; ultimately, I believe that the wealth of data that the LSST project intends to provide will be invaluable to our understanding of space.  Additionally, I think that the utilization of big data analysis will be essential to the success of this project; the Sloan Digital Sky Survey “produced roughly twenty thousand academic papers”, and the LSST will produce an amount of data equivalent to the entire Sloan Digital Sky Survey every three days. Without big data analysis to help process this massive amount of information, many of the LSST’s discoveries would remain unexplored. With big data analytics, however, we will be better able to parse and utilize the discovered information to further our scientific understanding of space.
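To put the Sloan comparison in perspective, the article’s two figures can be combined into a rough estimate of the implied size of the entire Sloan survey (a sketch of my own; the ~10 PB/year and every-three-days figures come from the article, the arithmetic is just decimal-unit bookkeeping):

```python
# If the LSST collects ~10 PB/year and matches the entire Sloan
# Digital Sky Survey every three days, the implied size of the
# whole Sloan data set is three days' worth of LSST output.
PETABYTE = 10**15                 # bytes (decimal convention)
lsst_per_year = 10 * PETABYTE
lsst_per_day = lsst_per_year / 365
sloan_total = 3 * lsst_per_day    # "the entire Sloan survey every three days"
print(f"~{sloan_total / 10**12:.0f} TB for the full Sloan survey")
```

In other words, a survey that produced roughly twenty thousand papers fits in what the LSST would generate over a long weekend.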

Ford’s Utilization of Big Data Functionality

Ford’s expansion into the realm of big data is a very modern move for the company. The volumes of data available for analysis in the American automotive market are staggering, and they are potentially a boon for anyone capable of making use of the information at hand.

Personally, what I find most interesting about this expansion into data collection is the possible infringement on personal privacy for the sake of statistical analysis. Onboard computers can record not only the typical driving data a person might expect (fuel efficiency, distance traveled over time, average speeds, etc.); a vehicle bought and used for personal reasons could also transmit personal information, such as particular routes taken, the times at which one travels, and even extrapolated work schedules.

As I assume that the imperative for a corporation such as Ford is profit, I can understand their desire to make sense of any data that can provide them with an edge over their competitors, especially since they are losing ground to other automotive corporations that are outstripping them technologically. As someone whose family has a vested financial interest in domestic automakers, I would say that I support Ford in their endeavors, but as an individual, I worry about the collection of this information. It is possible that Ford could sell the information collected once they are done with it, which seems to be an increasingly popular course of action.

Big data seems to change the act of marketing from an art into a statistically modeled science. Personally, I worry that any project that collects massive amounts of information could see that information used for purposes beyond product improvement: to shape marketing campaigns and even to set product prices at exactly what a given demographic is willing to pay.

The Emergence of Mobile Creativity

As discussed in the Co.CREATE article “The Age of Mobile Creativity: Are We There Yet?”, mobile apps and ads are starting to be considered legitimate canvases for artistic expression; for the first time, the Cannes Festival has created a Lion award category for mobile media. This shift toward viewing mobile work as potential artwork is certainly culturally appropriate, but it may be an honor given too early to be fully appreciated by a financially constrained mobile market. Revenue from mobile ads in 2011 came out to $1.6 billion; while this sounds impressive, it’s a mere fraction of the revenue garnered by search ads ($14.8 billion) or display ads ($11.1 billion). Because so much of the market depends upon the money provided by ads, creators have little financial capital left with which to develop new, innovative concepts for mobile apps.
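Just how small a fraction is it? A quick calculation of my own, using only the dollar figures quoted above:

```python
# 2011 ad revenue by category, in billions of dollars (from the article)
revenue = {"mobile": 1.6, "search": 14.8, "display": 11.1}

total = sum(revenue.values())                    # all three categories combined
mobile_share = revenue["mobile"] / total         # mobile's slice of the pie
vs_search = revenue["mobile"] / revenue["search"]

print(f"mobile is {mobile_share:.1%} of the three categories")
print(f"and roughly {vs_search:.0%} of search-ad revenue alone")
```

Mobile’s slice works out to well under a tenth of the combined pie, which makes the funding squeeze on creators easy to believe.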

Fortunately, a lack of money hasn’t stopped these developers from thinking about what they want to produce and what they think will make an app succeed. According to some of the creative minds of mobile production, award-winning apps are not the ones that are visually striking or otherwise traditionally artistic; rather, they’re the apps that use the capabilities of mobile technology to provide users with new paths for social connection. As one of the mobile-using masses, I have to agree, especially when it comes to mobile games. When I want to have some fun with my mobile device, my favorite apps to pull out are the ones that allow me to play and interact with other people, like “Words with Friends” or “Draw Something.” The social component that games like these provide really enhances the mobile gaming experience!

Although the financial constraint on mobile creativity is a legitimate one, some would claim that the biggest roadblock currently standing in the path of innovative mobile app development is the difficulty of providing a consistently positive experience across different brands and types of mobile devices; few companies can lay claim to the resources necessary for app functionality across all platforms. Still, this is only the beginning for this outlet of expression; with time, it could very well become an artistic medium that redefines the connections between art, technology, and social media.

The Effect of Cloud Computing on Software Pricing

The transition in today’s technology towards Cloud Computing means big things for lots of people.  There is an entire generation of new services and products that will have a sleek, far-reaching form of distribution.  This new method of disseminating software stands to do a lot for the pricing of traditional software packages, and a lot for our wallets as well.

But how exactly will this change things?  Instead of the burdensome bundles of Microsoft Office and other similar suites, Saikat Chaudhuri is cited in the Financial Post article “Cloud Computing Disrupts Software Pricing” as saying, “Instead of big suites, lightweight applications will become the norm.”  With the accessibility that comes with Cloud Computing, customers will be given more options on which specific products they want to buy.  This cuts out the burdensome office suites and allows consumers to pinpoint the software they need without having to deal with the rest.  Mitchell Osak writes in his article, “Fading fast are the days when general-purpose software packages were sold in boxes with a one-time, perpetual software license fee plus expensive maintenance and upgrade charges.”  We don’t want to deal with those annoying prompts for updates and maintenance when we only plan on using the software for a limited period of time.  This new distribution structure has the potential to drop all of the extraneous pieces of proprietary garbage, and what we get in exchange is a leaner system of purchasing software, one that reduces prices and caters to the needs of the customer.

As a college student, I am looking forward to the opportunity that Cloud Computing has opened up.  What we will see in the coming years is experimentation with subscription-based services and software pricing.  This will most certainly lead to cheaper prices for us software-hungry masses.  I will have the ability to pick and choose the software that I require for my classes, and not only that, but there will most likely be options for subscription services in addition to traditional licenses.  As long as the quality of the software stays the same, having more options when it comes to purchasing it is strictly better than having fewer, and I’m certainly not complaining.

The Rise and Fall of Nokia

In this article from PandoDaily, Farhad Manjoo predicts that Nokia’s end as a company is near. Specifically, he claims that Nokia’s reliance on other companies’ software is the primary reason why Nokia is currently struggling; whereas successful mobile companies such as Google and Apple provide the software and, occasionally, the hardware for the massively popular Android and iPhone devices, “Nokia…always thought of itself as being in the device business. It made hardware, and it only cared about software to the extent that it needed code to run on those devices.” A memo sent to Nokia’s employees by recently appointed CEO Stephen Elop supports Manjoo’s claim. In it, Elop points out that while Nokia had continued to create solid hardware, they had failed to create entire device ecosystems.

In spite of Elop’s timely analysis of the issue, Nokia is nevertheless in decline. Referencing Apple’s similar decline in the 1990s in the face of Microsoft’s market ubiquity, Manjoo suggests that Apple was able to regain strength by “changing the rules of the game”, offering a solution that went beyond simply providing PCs. Nokia, however, isn’t showing the same type of innovation. In an attempt to quickly reverse their deterioration, Nokia has partnered with Microsoft to distribute the Windows mobile OS. However, their success now relies on that of the Windows mobile OS, and Nokia’s Windows-based Lumia phones aren’t selling as well as either company would hope. While Elop has high hopes for the upcoming Windows 8 phones, Manjoo is not as optimistic regarding Nokia’s overall fate.

On a broader scale, this article is telling about how the device market has changed in recent years. Nokia’s decline suggests that it’s no longer economically feasible for IT production companies to produce only one element of a total product and still expect to be both independent of other companies and financially successful. Rather, to succeed in the current business climate, companies need to consider providing combined IT solutions to consumers’ problems.

The Development of Data Visualization

Data visualization techniques can further a user’s understanding of massive data sets and can be applied to nearly any type of information. Many popular data visualization techniques are adapted from already familiar forms such as pie charts, heat maps, and subway maps. Additionally, data visualization standards have been influenced by the concepts of proximity, similarity, and size: many designers group associated topics near each other and use larger graphical representations for frequently occurring data points and smaller ones for rare ones. Smashing Magazine’s article “Data Visualization: Modern Approaches” gives a brief overview of particularly compelling data visualizations, including ChalkLabs CTO Bruce Herr’s visualization of Wikipedia articles.

While data visualization techniques are undeniably effective for static data sets (such as a city’s population demographics at a specific time), they’re also extremely useful when representing live or variable sets of information. For example, OnISP’s live call map is a live visual representation of all of the calls being made or received by phones using OnISP’s VoIP service.

This type of visualization is particularly effective because of its dynamism; simply listing how many users a system has or how many connections are made per minute can be informative, but it’s not particularly attention-grabbing. In contrast, a real-time map tracking each connection made through OnISP gives the user a more concrete sense of the significance of the represented data points. The user is able to personally experience the data, which aids in understanding and processing the represented information.

The increased use of data visualization is particularly important for businesses to keep in mind when considering how best to communicate pertinent information from their Big Data sets. Data visualization makes it easier for users to understand large, complicated data sets, ultimately improving usability and efficiency.

The White House Big Data Initiative

Big Data is the term applied to the general problem of manipulating increasingly large and complicated data sets. We take data, process it, analyze it, and then come to a conclusion based on the given data.  The problem users face is that the sheer volume of data through which they need to sort is becoming impractical to handle with traditional methodologies.

In line with this growing need for Big Data solutions, the Obama Administration has recently announced the “Big Data Research and Development Initiative.” Pumping $200M of federal funding into a field largely dominated by open source communities, this initiative is full of promise. Security Week’s article, “Obama Administration Places $200 Million Bet On Big Data,” outlines the opinions of many White House officials on the subject, as well as the effects this decision will have on military and scientific fields.  “In the same way that past Federal investments in information-technology R&D led to dramatic advances in supercomputing and the creation of the Internet,” the Big Data initiative “promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education, and national security,” said Dr. John P. Holdren, Assistant to the President and Director of the White House Office of Science and Technology Policy.

In addition to the fields mentioned in the article, the Big Data Research and Development Initiative will also have a significant effect on commercial businesses.  Internet-based social media services and marketing groups have vast quantities of data at their disposal, but such quantities of information are difficult to process using traditional sorting methods.  Instead of forcing analysts to manually sift through large quantities of complicated data to fully understand the state of a business, Big Data analytics can find patterns that common data-crawling techniques are seldom able to uncover. Due to this increased ability to process information, companies using Big Data solutions can make faster and more informed business decisions than businesses still using traditional data processing methods; this ability to make informed decisions based on large quantities of data is an important facet of business intelligence. About 34% of organizations have reported applying Big Data analytics to large quantities of data, and, in order to remain competitive, more businesses will need to begin utilizing Big Data solutions.

This overview is in response to Security Week’s article, “Obama Administration Places $200 Million Bet on Big Data”.