We include here a list of curated readings and materials covering the broad range of themes that will be discussed in the workshop.

Background References

(a) Exclusions in the transition to Aadhaar-enabled service delivery systems

(b) Limitations of biometric authentication and the need for alternatives

(c) Aadhaar and the creation of a new centralized service delivery model

(d) Violations of democratic accountability in the Aadhaar project

(e) Trends in the use of Big Data in governance: India and abroad

(f) Meaningful participation in digital democracy

(g) Involvement of private actors in digitalized governance systems in India

(h) Issues/concerns for the governance of data-in-governance

(a) Exclusions in the transition to Aadhaar-enabled service delivery systems

– How poor and marginalized groups fall out of the welfare net in the switch to Aadhaar-enabled service delivery. See Medianama (2016), Aadhaar disruption in Rajasthan: Not in a good way.

– Errors in biometric authentication lead to food rations being denied. See Scroll (2016), In Rajasthan, there is ‘unrest at the Aadhaar shop’ because of error-ridden Aadhaar.

– Claims of efficiency and effectiveness in Aadhaar-enabled service delivery are not being borne out. See Khera (2016), On Aadhaar success, it’s all hype – that includes the World Bank.

(b) Limitations of biometric authentication and the need for alternatives

– The accuracy of biometric identification depends on the chance of a false positive: the probability that the identifiers of two persons will match. Individuals whose identifiers match might be termed duplicands. When very many people are to be identified, success can be measured by the (low) proportion of duplicands. The Government of India is engaged in biometrically identifying the entire population of India through Aadhaar. An experiment performed at an early stage of the programme allows us to estimate the chance of a false positive, and from that the expected proportion of duplicands. For the current population of 1.2 billion, that expected proportion is 1/121, a ratio that is far too high. See Verghese Mathews (2016), Flaws in the UIDAI Process.
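The arithmetic behind such estimates can be sketched with a simple pairwise-collision approximation. This is a back-of-the-envelope illustration, not Mathews' actual estimation procedure: the per-pair false-match probability `p` used here is a hypothetical input, backed out from the stated 1/121 figure rather than taken from the UIDAI experiment.

```python
def expected_duplicand_proportion(n: int, p: float) -> float:
    """Expected fraction of n people involved in at least one false match,
    assuming each of the n*(n-1)/2 pairs matches independently with
    probability p. Expected matching pairs = n*(n-1)/2 * p; each pair
    involves 2 people, so the proportion is roughly (n - 1) * p."""
    pairs = n * (n - 1) / 2
    return min(1.0, 2 * pairs * p / n)  # simplifies to (n - 1) * p, capped at 1

n = 1_200_000_000  # population to be enrolled

# Hypothetical per-pair false-match probability implied by a 1/121 proportion.
p = (1 / 121) / (n - 1)

print(expected_duplicand_proportion(n, p))  # ≈ 0.00826, i.e. 1/121
```

The point of the sketch is that even a vanishingly small per-pair probability (here on the order of 10^-11) is multiplied across an enormous number of pairs, which is why false-positive rates that look negligible in small trials can produce an unacceptable duplicand proportion at population scale.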

– Smart Cards may be more reliable than biometric authentication but this is not being considered. See eGOV (2016), Government restricts smart cards: promotes Aadhaar.

(c) Aadhaar and the creation of a new centralized service delivery model

– In May 2016, following the enactment of the Aadhaar Act, the Secretary of the Department of Electronics and Information Technology announced a plan to create a convergent database of beneficiaries for Aadhaar-enabled service delivery, positioning this as the foundation for an efficient welfare regime in the country. The Secretary also shared that the government is contemplating handing over the power to update this database to the CEOs of janpads (block-level administration) in rural areas, and to the Chief Municipal Officer in urban areas. This arrangement may not guarantee the decentralisation of discretion necessary at the last mile for responsive action on beneficiary identification and selection. See GovernanceNow (2016), Using data to improve social welfare schemes.

(d) Violations of democratic accountability in the Aadhaar project

– The passage of the Aadhaar Bill was nothing short of a travesty of democracy. First, even basic parliamentary procedures were disregarded completely, and the passage of the Bill subverted an ongoing judicial process. Second, the Bill has a number of provisions that put at risk, and offer no protection to, a number of constitutional rights and liberties of citizens. Third, the financial savings that the government has claimed from the use of Aadhaar, on the basis of which the idea has been sold, rest on wrong data. See Ramakumar (2016), Freedom in peril.

– The power to declare which services require the Aadhaar number for authentication is too sweeping a power to be handed to the central government in the form of a blank cheque; Parliament should require that all subsidies and services be approved by it before the legislation is enacted. See Thikkavarapu (2016), The Aadhaar Bill is yet another legislation that leaves too much power with government at the centre.

– Aadhaar makes citizen-profiling all the easier. See Ramanathan (2016), Aadhaar is like drone warfare versus hand to hand combat, profiling becomes all that more easier.

– The Aadhaar Act makes no mention of “privacy” and refers only to “confidentiality” in its provisions. Specifically, it restricts access to the identity information and authentication records stored on the UIDAI database, in order to protect the security and confidentiality of individuals. However, two broad exceptions significantly dilute this restriction:

(a) Exception 1: District judges can pass orders that authorize state agencies’ access to Aadhaar data without any disclosure or discussion with the citizen affected, and without any avenue for appeal.

(b) Exception 2: In the interest of ‘national security’, any Joint Secretary authorised by the government can direct disclosure of information. See Arun (2016), Privacy is a fundamental right.

– Aadhaar creates a financial architecture that is completely unauditable. See Saraph (2015), Banking on Aadhaar: What concerns the RBI.

(e) Trends in the use of Big Data in governance: India and abroad

In India:

CIS (2016), Big data in Indian governance: Preliminary findings.

Economic Times (2014), PMO using Big Data techniques on,

Trends abroad:

BBC (2015), China’s social credit: Beijing sets up huge system.

Economist (2014), Parole and technology: Prison breakthrough.

(f) Meaningful participation in digital democracy

– Exploring public Wi-Fi as a viable route for enhancing affordable access to the Internet. See TRAI, Consultation Paper on Proliferation of Broadband through Public Wi-Fi Networks.

– Ensuring that freedom of expression in online spaces is not curbed. See P. Visvaksen, UN Condemns Disruption of Internet Access.

– Access enables meaningful participation only when there are use-cultures that expand informational, associational and communicative choices; providing connectivity alone, therefore, may not be enough to ensure meaningful online participation. See Gurumurthy, A. and Chami, N. (2015), The Internet as a game changer for India’s marginalised women.

– Why net neutrality legislation needs to be understood as a political idea and not a limited technical principle of maintaining a level playing field, if we are to democratize online participation. See Parminder Jeet Singh (2015), Net Neutrality is Basically Internet Egalitarianism.

(g) Involvement of private actors in digitalized governance systems in India

– Corporatization of service delivery in the Common Service Centres scheme. See Salman (2016), Kerala’s e-gov centres might sell products from Jio and other companies.

– Common Service Centres tend to cater only to middle class groups and not to poor and marginalized individuals, as their primary motive is profitability. See Kuriyan and Ray (2007), Public-Private Partnerships and Information Technologies for Development in India.

– How the Smart Cities Mission leads to a new privatized model of urban governance in the country. See Sampath (2016), Fooled by smartness.

– Opening up urban governance as a new market for data companies. See Sundar (2016), Are Smart Cities a Smart Idea?

– Aadhaar-based open Application Programming Interfaces as a new business opportunity for companies interested in creating data-based solutions in health, education, financial services, etc. See Livemint (2016), Aadhaar 2.0: Creating India’s digital infrastructure.

(h) Issues/concerns for the governance of data-in-governance

– Identifying design principles that ensure cradle-to-grave secure management of information in the design of data systems. For example: “…large data systems should store data in a distributed manner, separated by type (e.g. financial vs. health) and real-world categories (e.g. individual vs. corporate), managed by a department whose function is focused on those data, and with sharing permissions set and monitored by personnel from that department. Best practice would have the custodians of data be regional and use heterogeneous computer systems. With such safeguards in place, it is more difficult to combine data types without authentic authorization. Similarly, data sharing should always maintain provenance and permissions associated with data…Best practice would share only answers to questions about the data (e.g. by use of preprogrammed SQL queries known as Database Views) rather than the data themselves, whenever possible.”

See World Economic Forum (2016), Rewards and Risks of Big Data.
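The “share answers, not data” practice quoted above can be illustrated with a toy SQLite sketch (the table, view, and all names here are invented for illustration): the raw beneficiary table stays private, and a requesting department queries only a preprogrammed view that answers one approved question.

```python
import sqlite3

# Toy illustration of sharing answers rather than data: identifying
# columns never leave the database; only the view's aggregates do.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE beneficiaries (
    id INTEGER PRIMARY KEY,
    name TEXT,               -- identifying data, never shared directly
    district TEXT,
    monthly_ration_kg REAL
);
INSERT INTO beneficiaries VALUES
    (1, 'A', 'Ajmer', 25.0),
    (2, 'B', 'Ajmer', 25.0),
    (3, 'C', 'Jaipur', 35.0);

-- The only interface exposed to a requesting department:
CREATE VIEW ration_by_district AS
    SELECT district,
           COUNT(*) AS beneficiaries,
           SUM(monthly_ration_kg) AS total_kg
    FROM beneficiaries
    GROUP BY district;
""")

# The requester runs the preprogrammed query and sees only aggregates.
for row in conn.execute("SELECT * FROM ration_by_district ORDER BY district"):
    print(row)
# ('Ajmer', 2, 50.0)
# ('Jaipur', 1, 35.0)
```

In a production system the view would sit behind database permissions (e.g. the requesting account is granted SELECT on the view but not on the underlying table), which SQLite does not model; the sketch only shows the interface shape the WEF passage describes.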

– Robust legislation for privacy and data protection. See the report of the Justice A.P. Shah Committee (2012).

– Evolving an institutional framework that recognises the need to socialise data as an anonymised big data commons, with adequate room for context-specific, political decisions about what should remain ‘private’ and what should be ‘public’. One way to do this is illustrated in the publication ‘Big Data in Our Hands’ produced by the P2P Foundation, which recommends that:

– At the individual level: users no longer readily surrender data.
– At the regulatory level: the monopolisation of data can be challenged.
– At the social level: people are able to participate (a) in an existential part of the environment [i.e. creating and sharing data], (b) in political processes (decision making on rules, distribution, etc.), and (c) in economic processes (in which “my” data becomes a potential economic resource which I am able to exploit myself, or I can have it exploited by third parties).

See the publication for more details.

– Introducing clear protocols for Big Data-based decision-making within governance systems. We can take a leaf out of the EU’s General Data Protection Regulation, slated to take effect in 2018, which gives all EU citizens the right to ask for an explanation of algorithmic decisions made about them, and disallows the automated processing of personal data for the purpose of personal profiling. Some readings on this:

EU citizens might get a ‘right to explanation’ about the decisions algorithms make

Bryce Goodman and Seth Flaxman (2016), European Union Regulations on Algorithmic Decision-making and a “Right to Explanation”.

Kate Crawford and Jason Schultz, Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms.