We might need a Bretton Woods pact on data regulation

In October 1929, when the US stock market crashed, countries around the world rushed to put in place measures to shield their economies from the repercussions that were beginning to be felt globally. Rather than stemming the rot, however, these protectionist measures – tariff barriers, currency devaluations and discriminatory trading blocs – created an unstable international environment that made matters worse. Many historians believe that, had World War II not intervened, the repercussions of the Great Depression would have stayed with us much longer.

The experience of living through the 1930s was so unnerving for world leaders that in July 1944, with World War II still raging, delegates from 44 Allied nations gathered in Bretton Woods, New Hampshire, for what was officially called the United Nations Monetary and Financial Conference. By its end, the participating countries had decided that instead of each continuing on the gold standard, they would make the US dollar the global currency of trade, with the dollar itself pegged to gold. For this new arrangement to work, participants committed to fixed exchange rates between their national currencies and the dollar, and agreed to refrain from trade wars in which they depreciated their currencies to boost exports. This agreement, known as the Bretton Woods Agreement, also led to the creation of the International Monetary Fund, a multilateral agency from which member countries could borrow to support the value of their currencies when they lacked the funds to do so themselves.

While opinion is divided on the success of the Bretton Woods Agreement (especially since US President Richard Nixon permanently decoupled the dollar from its gold anchor in 1971), there is little doubt that it shaped how the world approaches the international flow of funds to this day. It created a playbook for multilateral policy-making that compels countries to cooperate to achieve global goals, even when that requires short-term national sacrifices.

In my article last week, I pointed out that there are currently three different approaches to data governance practiced by countries around the world. With each passing year, these differences have hardened to the point where it has become difficult to find common ground. And yet, given our growing reliance on data for almost everything we do, I argued that the fragmentation this divergence brings will lead to complications we cannot afford. I advocated that countries find common ground by agreeing on a basic normative framework for governance, on which regional variations can then be overlaid as needed.

In a speech to the Oxford Internet Institute in September last year, Elizabeth Denham, the UK’s former Information Commissioner, made an impassioned plea for a new Bretton Woods pact for data. “Our current approach to data protection,” she argued, “taken nation by nation, can only take us so far. If we are to unlock the full potential of data-driven innovation, backed by public trust in how data is used, we need an international approach to data protection standards. We need an international solution.”

How would this work in practice? First, Denham suggests that we stop trying (as the EU has been doing since its General Data Protection Regulation came into force) to get every country in the world to adopt a uniform global law. Instead, we should seek to create a global alliance, membership of which would be open to countries with a demonstrated commitment to data protection backed by independent regulation, allowing them to transfer data at low risk to other member countries. To be clear, this sets the bar for entry far lower than a GDPR adequacy requirement would, but it still assures individuals that their data will be subject to the same protections around the world.

If we take this idea to its logical conclusion, we would need to create an entirely new multilateral institution for data, charged with coordinating a regulatory dialogue between nations in order to achieve a sufficient level of regulatory harmonization. In line with that role, it has been suggested that such an organization be established along the lines of the Financial Stability Board, with a mandate to oversee and make recommendations on global data governance and cross-border data flows.

Given the speed of modern data flows and the variety of digital paths they can take, nothing less than a technological solution will ensure compliance with whatever common regulatory principles are eventually agreed upon. This will require a normative framework that can be embedded directly into national digital infrastructure, so that we can be assured that the data flowing through these systems meets basic data governance requirements. One of the main purposes of such a data stability board could be to develop a common set of technical standards and protocols for all aspects of data governance that member states could hard-code into their national systems, ensuring that those systems are interoperable not just domestically but with each other.
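To make the idea of a hard-coded standard concrete, here is a minimal, entirely hypothetical sketch of what such an embedded rule might look like in software. The alliance membership list, country codes and policy are all illustrative assumptions, not any real protocol or proposal:

```python
# Hypothetical sketch of an alliance-membership check embedded in
# national digital infrastructure. All names and rules are assumptions
# for illustration only.
from dataclasses import dataclass

# Illustrative set of countries that have opted into the hypothetical
# alliance, each committed to a common baseline of data protection.
ALLIANCE = {"IN", "GB", "JP"}

@dataclass
class DataPacket:
    origin: str        # ISO country code where the data originates
    destination: str   # ISO country code of the receiving system
    category: str      # e.g. "personal", "anonymised"

def transfer_allowed(packet: DataPacket) -> bool:
    """Permit a cross-border transfer only between alliance members,
    so the data stays under equivalent protections end to end."""
    return packet.origin in ALLIANCE and packet.destination in ALLIANCE

print(transfer_allowed(DataPacket("IN", "GB", "personal")))  # True
print(transfer_allowed(DataPacket("IN", "US", "personal")))  # False: "US" not in the illustrative alliance
```

The point of the sketch is not the trivial check itself but where it sits: if every member state's infrastructure enforces the same rule at the protocol level, interoperability and compliance follow automatically rather than depending on after-the-fact enforcement.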

Rahul Matthan is a partner at Trilegal and also has a podcast under the name Ex Machina. His Twitter handle is @matthan
