The influence of big data on monitoring the factual quality of digital media in Southern Africa.
Date
2022
Authors
Abstract
This study explores how big data can drive innovation in response to dynamic change and give
society an advantage in fact-checking and monitoring new media and in countering false
information. The study emphasises that big data may answer questions and offer insights that
society never had access to before. In the current news media environment, the services that
enable the sharing and production of large amounts of data are not sufficient to combat
increasing fake news, ongoing public mistrust, and false, partisan media content produced for
financial gain and greater influence in society. There is an urgent need for intervention, which
big data innovation can provide. There are, however, some myths about the use of big data that
need to be dispelled, such as the idea that analysing the data will, by itself, ensure
transparency and reliable content distribution from the developers of big data systems to the
audience consuming the data. Innovating and gaining an advantage from data is more complex than
simply collecting large amounts of it; understanding the impact big data will have on a society
is vital to leveraging it. The study explores this notion through the Digital Data Genesis
Capability Model, which guides the structure of the case study conducted in the media
fact-checking sector. The development of a big data initiative is built on fundamental expertise:
according to the findings, highly skilled employees with knowledge of both proprietary and
open-source tools are essential to developing big data systems. Furthermore, deploying a big data
system on the web demands a high level of compatibility between existing web standards and the
tools being used. As a result, the development of a big data initiative by a technology-focused
organisation is limited only by its ability to implement an effective big data workflow. Such a
workflow requires detailed planning; cloud computing for hardware and software; outsourced
third-party services; in-house work on data structures; and Docker containers, which enable
mobility in the development process and the adoption of new technology when implementing the
searching and querying of large datasets and streams. One deviation from the existing model was
noted: the study's context showed that it is possible to implement big data initiatives as a
partnership between two or more companies, provided the companies share some business traits or
the same philosophy, thereby changing the dynamic of routines and responsibilities in the
existing landscape.
Description
Master's degree. University of KwaZulu-Natal.