“Broadcast ratings is like a monster that we have to learn to live with. If the ratings make everybody equally unhappy, then they are actually doing something!”
Senior journalist Arjuna Ranawana, who has worked with the Canadian Broadcasting Corporation (CBC) for several years, made this observation at the July 2017 launch of a new study assessing the broadcast ratings system in Sri Lanka. In mid-2016, the Ministry of Mass Media appointed a six-member panel of experts to look into the methodology, perceptions and related issues surrounding radio and TV ratings systems in the country. (Disclosure: I was a member of this panel, along with four university academics and an independent media researcher.)
We, too, worked from the premise that ratings are a necessary, if imperfect, tool. So, how can the system be made more rigorous and credible? And what should be the government’s role in this private sector-driven activity?
Our report, now published (see: https://goo.gl/EZNPNq), has tried to answer these and other questions. It recommends that ratings should continue as a private sector activity, but with some degree of monitoring and quality assurance by the government and other stakeholders.
RATINGS AND BASHINGS
Broadcast ratings data is the media industry’s “currency” for understanding media consumers’ preferences and engagement with TV and radio. Based on the media habits of a sample of consumers, ratings quantify how popular broadcast content is. Using these insights, media managers and content creators optimise their programming, while advertisers choose channels and timeslots more efficiently.
Ratings systems originated and evolved in the United States, initially using manual methods that were later automated. Nielsen Media Research-pioneered methods (for radio in the 1940s and TV in the 1950s) were later adapted in many countries.
In Sri Lanka, ratings services emerged a few years after the broadcast sector was liberalised in the early 1990s. As the number of channels increased, advertisers needed a basis for distributing their advertising. Every channel claimed they had the highest reach, so an external measure became necessary.
[pullquote]Currently, two market research companies operate audience rating systems on radio and TV, i.e. Lanka Market Research Bureau (known as LMRB) and Survey Research Lanka (SRL)[/pullquote]
Currently, two market research companies operate audience rating systems on radio and TV, i.e. Lanka Market Research Bureau (known as LMRB) and Survey Research Lanka (SRL). Both are subscriber-driven services.
Their ratings are used by advertisers, media buying companies and some (but not all) broadcasters. Over the years, some TV and radio stations have expressed dissatisfaction with the ratings methodologies and outcomes. One of them even filed a case in the District Court of Colombo in 2014 against a ratings provider, but it was later settled without the allegation of ‘improper research’ being proven.
Ratings matter because they have a high level of influence over the economic viability of broadcast stations, as well as on programme scheduling decisions. Billions of rupees worth of advertising decisions are made using this data. As the state entity that issues all broadcast licenses, the Media Ministry had received various complaints about the ratings system, which led to the expert panel being appointed.
CONTENTIOUS DATA
The expert panel had lengthy discussions with everyone involved: managers at (state and private) broadcast stations; senior management of LMRB and SRL; the advertising industry’s association (4As); and marketing managers of several large companies that were Sri Lanka’s top 10 advertising spenders in 2015. We also surveyed the media landscape, and looked at international best practices.
Broadcasters had a long litany of grievances about how ratings are calculated. Some were not satisfied with the geographical and linguistic distribution of People Meters (used for tracking TV viewing) or diaries (for radio listening). They claimed that the current methodologies were unfair to smaller broadcasters that cater to numerically small but culturally significant niche audiences such as Tamil or English speakers. But the ratings companies did not agree. LMRB said they have installed 600 People Meters all over the country using a “highly scientific, theoretically sound sampling” method. Assuming four family members over four years of age in each household, these meters provide a sample of 2,200 viewers for daily reporting.
The Census in 2012 counted 5,251,126 households in Sri Lanka, and found that 78.3% of these (or 4.1 million households) had at least one TV and 68.9% (3.6 million) owned a radio. Vehicle-mounted radio or TV receivers were not counted. For planning purposes, the media industry estimates Sri Lanka’s cumulative broadcast audience at around 14 million persons.
Thus, notwithstanding statistical nuances, tracking the viewing habits of 2,200 persons seems inadequate. Ratings companies say that, given the high cost of installing and maintaining automated tracking units, it is unrealistic to increase the sample for greater representation.
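A rough back-of-envelope calculation illustrates why a sample of this size can look adequate nationally yet thin for niche audiences. The sketch below assumes a simple random sample at 95% confidence (actual ratings panels use stratified designs, and the 5% niche-audience share is purely a hypothetical figure, not from the report):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a proportion,
    using the conservative worst case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

national = margin_of_error(2200)           # the full panel of viewers
niche = margin_of_error(int(2200 * 0.05))  # a hypothetical 5% niche audience

print(f"national: +/- {national:.1%}")  # about +/- 2.1%
print(f"niche:    +/- {niche:.1%}")     # about +/- 9.3%
```

In other words, a channel share estimated from the full panel is fairly precise, but estimates for a small linguistic minority rest on only about a hundred panel members, so the uncertainty grows several-fold, which is consistent with the smaller broadcasters’ complaint.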
“We can increase the sample size if subscribers are able to meet the extra cost,” noted Himalee Madurasinghe, LMRB’s chief executive, at the report launch. But, that too seems unlikely: ratings reports already come at a high subscription fee, taking them beyond the reach of smaller broadcasters and advertising agencies. One of our recommendations is for the government to grant duty concessions for importing these meters. Attempts to make them locally have not been successful.
OVERSIGHT BODY
A bigger problem that ratings companies face is how to deal with unethical practices by certain broadcast stations that try to manipulate the process by influencing the participating households (whose identity and locations are kept confidential). For example, some channels have been running promotions offering cash prizes for audience members who cite theirs as the “favourite channel” if and when anyone asks.
Being private companies, LMRB and SRL do not have punitive powers.
When confronted with unethical practices on the part of broadcasters, the ratings companies have sometimes threatened to withhold the ratings of the offending stations – but, in reality, no such action has been taken. This highlights the need for an independent dispute resolution mechanism, our report notes. Overall, we highlight the urgent need to create a ‘level playing field’ for all broadcasters to have their programming assessed on a more transparent basis and have their audience share determined through a more credible process.
“The government cannot – and should not – get into the task of producing broadcast ratings,” the report says emphatically. It adds: “The panel is convinced that the best way forward is to strengthen the current ratings service providers to do a better job.”
For this, we have recommended setting up an independent monitoring body with multi-stakeholder participation (i.e. with suitable representation from the Media Ministry, state and private broadcast companies, the advertising industry and broadcast rating service providers, along with eminent researchers or academics). Its overall mandate would be to monitor and validate the existing and future ratings systems; provide technical oversight on ratings methodologies; carry out periodical audits of ratings; serve as a dispute resolution body and complaints investigation mechanism; and investigate unethical promotions and attempted manipulations by broadcasters.
Such a body needs to be independent of the state, as well as of the ratings service providers and their subscribers. How it is financed remains to be decided.
OTHER CHALLENGES
Everyone we met during our hearings agreed that broadcast ratings only offer indications of the relative popularity of a programming timeslot. They cannot measure the quality of content (which is highly subjective, anyway).
As our report notes, “Ratings are not, and cannot, be an accurate measure of the quality standard of content because audience preferences can be based on factors other than quality. Indeed, in many media markets, there is a growing disconnect between high quality programming content and what is rated as ‘popular’. This is not the fault of ratings per se, but a larger issue of the market-driven media economy.”
[pullquote]Overall, we highlight the urgent need to create a ‘level playing field’ for all broadcasters to have their programming assessed on a more transparent basis and have their audience share determined through a more credible process[/pullquote]
Indicators of content quality can include the number of broadcast awards won. Sri Lanka currently has three annual awards schemes: Sumathi Awards (started in 1995), Raigam Awards (2003) and State Television Awards (2004).
Ravi Jayawardena, chairman of the state TV broadcaster Rupavahini, laments that advertisers are entirely ratings driven, and ignore awards and other factors. His station bagged more awards last year than any competitor, he says, but it is not a ratings leader. And what new challenges are emerging as our media market matures? Laksiri Wickramage, deputy chairman and chief executive of Derana, noted at the launch that cable TV already covers 22% of households. As this share increases, the real competition will be with dozens of foreign channels, and not among local terrestrial channels, he said.
And since protectionism is not the answer, Lankan broadcasters will need to adapt and innovate to retain their audiences.