From the standpoint of financial inclusion, around 70% of Bangladesh’s population is unbanked. This is changing, however, as more bank branches open in rural areas and Fintech solutions such as mobile financial services (MFS) spread. At the same time, because of the country’s social structure, many individuals coerce family members into opting in to, transferring, or opting out of commercial contracts without their full knowledge or consent. In the wave of the projected digital transformation built on greater banking access, this is extremely worrying. If proper policy is not implemented to protect credit scores in the 4IR, many individuals could fall into the trap of poverty, simply because emotionless algorithms will dictate the narrative without understanding context.
A Domestic View
Although it can be hard to comprehend how algorithms might affect you, let’s first understand how the country works today from a few viewpoints. First, your identity in the country is your National Identification (NID) card and/or your passport. Both are largely electronic and linked to most major databases. Your NID, at the very least, is directly linked to your tax file (you will notice that once you enter your NID when applying for your e-TIN, all of your information is already there). Furthermore, when you go to a bank, one of these documents is required, and the account you create, after going through the standard Know Your Customer (KYC) process established by the Bangladesh Bank, is tied to your CIB file at the central bank.
The CIB report effectively acts as your credit score in Bangladesh. The majority of Bangladeshis deal in cash until it comes to taking some form of loan for a car, real estate, and the like. To get a credit card, you will usually need to show bank statements and salary structures and/or deposit money in an FDR account. Many people therefore choose MFS services like bKash or Nagad to avoid this difficult process. As agent banking is rolled out, more people may come to rely on banks and cards.
Life is quite different in a Western nation like the USA. The Social Security number, unique to every person, acts like our NID card, though citizens are advised not to share it freely. Americans are heavily reliant on credit and use digital means for almost every aspect of their lives, from personal matters like education and mortgages to public services like health care, unemployment benefits and more. As data technologies progress, these processes are becoming highly automated, with invisible algorithms controlling the narrative. This could be the way of life coming to Bangladesh.
Family Ties and Financial Disputes
Now take a step back and imagine a hypothetical scenario. A young woman marries an older man. The husband has a better understanding of finance and reassures his wife that he will make all of the financial decisions. He takes out loans and maxes out credit cards, either jointly or solely in his wife’s name. Trusting him, she signs every document and does not realize what has happened until the bank calls in the debt or the NBR starts filing for tax evasion. Now consider: if all of this were recorded by automatic credit scores in CIB reports, who would you blame? This is an all-too-common scenario, not only in domestic partnerships but also between siblings and extended family in Bangladesh.
Caught by Coerced Debt
The scenario above illustrates a predicament called coerced debt. It is a form of abuse generally committed by an intimate partner or family member. Although economic abuse is a long-standing problem, digital banking will make it even more convenient to open accounts and take out loans in a victim’s name.
Reach of the Algorithms
When automated credit-scoring algorithms flourish in Bangladesh, the repercussions could be severely detrimental. In the Western world, credit scores have been used for decades to assess consumer creditworthiness. However, their reach is far more potent now that they are backed by algorithms that incorporate more types and greater volumes of data. These algorithms, if implemented in Bangladesh, will increasingly affect what car you can buy, which apartment you get, and whether or not you can land a permanent job. As in China, their extensive sway means that if your credit rating falls too low, it can become nearly impossible to recover (something straight out of the Netflix show Black Mirror).
Even worse, these algorithms are owned by private corporations that do not disclose their decision-making processes. Potential victims of coerced debt could be sent into a downward spiral that may end with them forced onto the street or returning to their abuser for support.
Moreover, credit-scoring algorithms will not only affect individuals’ economic well-being and access to consumer services. As Bangladesh develops into a welfare state, algorithms could also decide foster care placements, medical quotas and housing allocations for families.
Bearing the Brunt
If this rapidly growing technology of automated decision-making systems is adopted without the right policy, it will create an unseen web of interconnected traps. Higher-income groups may pass through life unaware of all this; the reality is different for low-income members of society. Low-income individuals bear the brunt of the shift toward algorithms. They are the people most vulnerable to temporary economic hardships, which are then recorded in consumer reports, and they are also the people who most need and seek government benefits. Everyone passes through many of these systems each day, but the consequences are harsher for those nearer the poverty line.
Cases to Consider
We can learn from the growth and conflict of two algorithmic webs prevalent in the U.S. The first is credit-reporting algorithms, which affect access to private goods and services like cars, homes, and employment. The second encompasses those adopted by government agencies, which affect access to public benefits like health care, unemployment, and child support services.
On the credit-reporting side, the growth of algorithms has been driven by the explosion of data that can easily be collected and shared. While credit reports aren’t new, their footprint is now far more widespread. Agencies compile these reports from a wide range of sources: public records, social media, web browsing, banking activity, app usage, and more. The algorithms can then assign people “acceptability” scores, which figure heavily into background checks performed by lenders, employers, landlords, and even schools.
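To make the idea concrete, here is a minimal, purely hypothetical sketch of how such a scoring model might combine disparate signals into a single number. The feature names and weights are invented for illustration; real scoring models are proprietary and far more complex, which is precisely the transparency problem discussed here.

```python
# Hypothetical, simplified illustration of how a credit-scoring
# algorithm might fold many data sources into one "acceptability" score.
# All feature names and weights below are invented for this sketch.

def acceptability_score(profile: dict) -> int:
    """Return a 300-850 style score from a toy weighted model."""
    weights = {
        "on_time_payment_rate": 300,    # fraction of bills paid on time (0-1)
        "bank_balance_stability": 150,  # normalized stability measure (0-1)
        "public_record_flags": -100,    # count of liens, judgments, etc.
        "app_usage_risk": -50,          # opaque behavioural signal (0-1)
    }
    score = 300  # floor of the score range
    for feature, weight in weights.items():
        # Missing data silently counts as zero -- the model has no
        # notion of *why* a signal is absent or negative.
        score += weight * profile.get(feature, 0)
    return max(300, min(850, round(score)))

person = {
    "on_time_payment_rate": 0.95,
    "bank_balance_stability": 0.8,
    "public_record_flags": 0,
    "app_usage_risk": 0.2,
}
print(acceptability_score(person))  # a healthy score
```

Note what the sketch cannot see: whether a missed payment was caused by a pandemic, or whether a loan was taken out coercively in the victim’s name. The number goes down either way, and every downstream check that consumes the score inherits that blindness.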
Conversely, government departments find it fruitful to adopt algorithms when they want to modernize their systems, replacing legacy processes with web-based apps and digital tools as part of a move toward data-driven automated systems and AI. In the U.S., for example, many unemployment benefit systems struggled to handle the massive volume of new requests during the pandemic, leading to significant delays. Modernizing these legacy systems promises faster and more reliable results.
Ghost in the Machine
However, the software procurement process is not always transparent and may therefore lack accountability, especially where corruption is involved (a predicament of many developing nations). Government departments often purchase these digital tools directly from private-sector vendors. The problem is that when systems are flawed, those flaws are not disclosed to the procurer, their lawyers, or the public. This lack of public vetting also makes the systems more prone to error.
The Legal Silver Lining
If an algorithm gets something wrong, you cannot hold it responsible, nor can you put it on the stand for cross-examination. This is exactly what happened in Arkansas, USA, in 2014, when an algorithm barred previously eligible candidates from Medicaid services. Moreover, the people representing the algorithm may not be data scientists but rather nurses, bankers, teachers, social workers and others who have no knowledge of how the algorithm placed an individual’s credit score where it did, leaving that person with sub-par service.
In the future, every case might be an algorithm case for lawyers. Lawyers need to be prepared in consumer law, family law, housing, and public benefits to handle the issues raised by algorithms and other data-driven technologies within the scope of existing laws. They need more training and more knowledge, not just in the law but in these systems. Civil lawyers may also need to build a movement to bring more inquiry and regulation to this hidden web of algorithms.
Victims of coerced-debt abuse should learn how to manage finances, and as these algorithms develop, they should learn more about credit algorithms themselves. Anyone who has been a victim of abuse should check their credit report daily for fraudulent activity. The government can partner with NGOs to offer those who have suffered training on these issues.
Looking back at a situation like the pandemic, one missed payment can be judged with empathy by a human being. An algorithm, however, will not care about the context when denying someone a much-needed loan to support their business. This can have a ripple effect on the individual’s existing housing payments, especially if other algorithms treat the inability to pay utilities as further reason to lower credit scores. As such, before we move from the world of cash to cashless, it is important that we know how to navigate it first at the individual, organizational and government levels.