Hand on your heart, could you trust a ‘Social Credit’ system like the one being piloted in China?

China is building high-tech, AI-controlled surveillance systems that use facial recognition, body scanning and geo-tracking to monitor citizens. This surveillance data is coupled with your shopping and internet browsing habits, what you say, the friends you associate with and your family's behaviour, and finally overlaid with traditional big data from government records, including educational, medical, state-security and financial data, to determine your Social Credit score. The resulting score determines whether you receive preferential treatment or punishment and confinement. Is this Human Capital Management 2.0?
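Purely as an illustration of the kind of aggregation such a system might perform, here is a minimal Python sketch of a hypothetical scoring pipeline. The data fields, weights and thresholds are all invented for this example and do not reflect any published criteria; they simply show how disparate data sources could be collapsed into a single score that then dictates treatment.

```python
from dataclasses import dataclass


@dataclass
class CitizenRecord:
    """Hypothetical inputs such a system might combine (illustrative only)."""
    surveillance_flags: int      # incidents flagged by facial recognition / geo-tracking
    purchase_risk: float         # 0.0 (benign) to 1.0 (deemed undesirable), from shopping data
    online_risk: float           # 0.0 to 1.0, from browsing and speech monitoring
    associate_avg_score: float   # average score of friends and family
    state_record_score: float    # composite of education, medical, security and finance records


def social_credit_score(r: CitizenRecord) -> float:
    """Collapse the data sources into one score; all weights are arbitrary assumptions."""
    base = 1000.0
    score = base
    score -= 50.0 * r.surveillance_flags
    score -= 200.0 * r.purchase_risk
    score -= 200.0 * r.online_risk
    score += 0.1 * (r.associate_avg_score - base)   # credit (or guilt) by association
    score += 0.2 * (r.state_record_score - base)
    return score


def treatment(score: float) -> str:
    """Map a score to an outcome band; thresholds are invented for illustration."""
    if score >= 1050:
        return "preferential treatment"
    if score >= 900:
        return "neutral"
    return "punishment / restrictions"


if __name__ == "__main__":
    citizen = CitizenRecord(
        surveillance_flags=1,
        purchase_risk=0.2,
        online_risk=0.4,
        associate_avg_score=980.0,
        state_record_score=1010.0,
    )
    s = social_credit_score(citizen)
    print(f"score={s:.0f} -> {treatment(s)}")
```

Even in this toy version, the design questions raised below are obvious: someone chooses the weights, someone chooses the thresholds, and a single compromised function decides everyone's fate.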

Logically, it is exceptionally clever. However, who controls the scoring criteria? What happens if the system is hacked and someone takes control of it? The phrase 'too many eggs in one basket' comes to mind.

How can computer algorithms genuinely appraise human behaviour? We are only just discovering our own abilities and potential. We humans are emotionally driven beings rather than cold, logical computers, and long may that last. So how can AI understand and grade our emotional behaviour?

Does this amount to handing control of the many to a few humans, and to AI? It will be interesting to see how this pans out.