Big data is a broad term for data sets so large or complex that traditional data processing applications are inadequate. Such data sets are characterized by increasing complexity along several dimensions: Volume (the quantity of data generated), Variety (the range of data types and sources), Velocity (how fast the data is generated and must be processed to meet demand), and Veracity (the quality of the captured data, which can vary greatly). Data sets grow in size in part because they are increasingly gathered by cheap and numerous information-sensing mobile devices and by computer and sensor networks. The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s. As of 2012, 2.5 exabytes (2.5×10^18 bytes) of data were created every day. Relational database management systems and desktop statistics and visualization packages often have difficulty handling big data.
Big Data Technology poses grand challenges in the analysis, capture, curation, search, sharing, storage, transfer, learning and modeling, visualization, and privacy of Big Data. These include the design of efficient and effective algorithms and systems to integrate data and uncover large hidden value in datasets that are diverse, complex, and massive in scale. Many breakthroughs are expected in this area: new algorithms, methodologies, systems, and applications that discover useful hidden knowledge from Big Data efficiently and effectively, across diverse areas of practice and application.
The educational objectives of this minor program are:
For program requirements and further details, refer to the catalog: Undergraduate Minor Program in Big Data Technology.
The non-CSE elective course list is currently empty. Courses will be added here upon their approval as electives.