President Trump’s “mental health” comments, and the discussion of an initiative to that effect in light of the recent “mass shootings” (not to deny that the shootings happened, but there have been far too many reports from on-the-ground observers that there was more to each story than was officially acknowledged or reported), have generated concern over the building of a “social credit”-type system, such as the one that exists in China, but for gun owners, in which one’s “mental health” would be determined by a series of artificial factors including but not limited to one’s purchases, movements, and even associations:
The White House has been briefed on a proposal to develop a way to identify early signs of changes in people with mental illness that could lead to violent behavior.
Supporters see the plan as a way President Trump could move the ball forward on gun control following recent mass shootings as efforts seem to be flagging to impose harsher restrictions such as background checks on gun purchases.
The proposal is part of a larger initiative to establish a new agency called the Health Advanced Research Projects Agency or HARPA, which would sit inside the Health and Human Services Department. Its director would be appointed by the president, and the agency would have a separate budget, according to three people with knowledge of conversations around the plan.
HARPA would be modeled on DARPA, the highly successful Defense Advanced Research Projects Agency that serves as the research arm of the Pentagon and collaborates with other federal agencies, the private sector and academia.
The concept was advanced by the Suzanne Wright Foundation and first discussed by officials on the Domestic Policy Council and senior White House staffers in June 2017. But the idea has gained momentum in the wake of the latest mass shootings that killed 31 people in one weekend in El Paso and Dayton, Ohio.
The Suzanne Wright Foundation re-approached the administration last week and proposed that HARPA include a “Safe Home” — “Stopping Aberrant Fatal Events by Helping Overcome Mental Extremes” — project. Officials discussed the proposal at the White House last week, said two people familiar with the discussions. These people and others spoke on the condition of anonymity because of the sensitivity of the conversations.
The attempt to use volunteer data to identify “neurobehavioral signs” of “someone headed toward a violent explosive act” would be a four-year project costing an estimated $40 million to $60 million, according to Geoffrey Ling, the lead scientific adviser on HARPA and a founding director of DARPA’s Biological Technologies Office.
“Everybody would be a volunteer,” Ling said in an interview. “We’re not inventing new science here. We’re analyzing it so we can develop new approaches.”
This is indeed the Chinese social credit system, for as we have discussed many times, its end goal is the same: exerting control over a people by “monitoring” most or all of their actions. The difference between the US and China, however, is that the Chinese system comes by means of open force, threats, and violence. The Americans are far less likely to do this because it would likely cause a social backlash and lead to chaos, legal action, or unwanted and unnecessary social instability.
The way the Americans would have to sell such a system to the public is through a series of ever-more restrictive laws and regulations whose stated purpose does not deal directly with a “social credit” system, but which are structured in such a way that they bring about the “unintentional” creation (though it was always intentional) of the same thing, only more restrictive, and one that will be embraced by the people. While the Chinese focus on making the “path” and carving it out piece-by-piece through the social terrain with all force necessary, the Americans see the end goal first and then reshape the social terrain without changing the public feel of it, and in doing so get the public to participate in its own enslavement, with a happiness that comes from a sense of community participation and national identity that does not truly exist organically.
People look at China’s system and see a series of facial-recognition cameras everywhere, following people in the style of Orwell’s 1984. This is obviously bad. The Americans, true to their national fashion, are not at this level yet in all areas, but are promoting “facial reading technology” in the name of making life easier, such as monitoring one’s home through Google Nest:
Google Home and Nest Hub gadgets already feature microphones that are always listening for the words that wake up the Assistant (“OK, Google” or “Hey, Google”). Now, the search giant’s newest gadget for your home, the Nest Hub Max smart display, adds in a camera that’s always watching for a familiar face.
Google calls the feature Face Match, and it uses facial recognition technology to remember what you look like. After that, you can tap on the screen to see personalized bits of data like calendar appointments and Google Duo messages whenever it recognizes you.
The Nest Hub Max isn’t the first product to bring facial recognition technology — and the legal and ethical considerations that come with it — into people’s homes. Smart phones have been using the technology to let us unlock our devices and authorize purchases for years, and a growing number of smart home gadgets that use cameras are putting it to use, too, including Google’s own Nest Hello video doorbell.
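For the technically minded, the “Face Match” recognition described above generally reduces to something simple: the device converts an enrolled face into a numeric “embedding” (the “face model”), then compares every freshly captured face against it by a distance measure. The following is a minimal sketch of that idea in Python, not Google’s actual model; the embedding vectors, names, and threshold here are entirely hypothetical (real embeddings have hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face embeddings (1.0 means identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(new_embedding, enrolled, threshold=0.8):
    # Compare a freshly captured embedding against all enrolled "face models";
    # return the best-matching user's name, or None if nobody is close enough.
    best_name, best_score = None, threshold
    for name, stored in enrolled.items():
        score = cosine_similarity(new_embedding, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical enrolled face models for two users of the device.
enrolled = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.9, 0.3]}
print(match_face([0.88, 0.12, 0.21], enrolled))
```

The privacy question in the excerpts that follow is not about this matching step, which can indeed run locally, but about where the enrollment images and the stored models travel.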
Now at this point, anybody who trusts anything that any American “tech company” has to say about privacy is likely not the “sharpest tool in the box.” Tech companies have consistently proven that they do not care about their own policies: they will pursue “legal action” against a consumer who fails to comply with their terms, yet when the same companies are caught selling personal data explicitly labeled as not for sale, they face only comparatively small fines, and no one ever goes to jail or faces personal ruin.
If these companies (pick a name, it does not matter which) will sell your data with such glee, what is to be said about the details of your home, your face, and your body? If such things are being monitored using this technology, would these companies, who have a track record of lying openly, not attempt to sell them as well?
“If camera sensing is enabled and the camera is on (i.e., not turned off via the hardware or software switch), then the camera is continuously processing pixels to look for faces and/or gestures,” a Google spokesperson explained. “This processing is done locally on the device, and no pixels leave the Nest Hub Max.”
Google’s representatives made a point of emphasizing the fact that the Nest Hub Max’s Face Match and gesture-tracking features don’t involve the cloud at all.
“Google’s AI camera-powered features are happening on the device itself,” said Ashton Udall, Google’s product lead for smart displays. “When you do auto-framing in video calls, or when you do Face Match, or when you do Quick Gestures, all of that is happening on the device. We don’t have to send things to the cloud in order for those things to happen. What happens in your home stays in your home with relation to these things.”
That’s good to hear, but when I began setting up the Nest Hub Max to test the device out, I saw the following disclaimer in the Google Home app:
“Face Match creates a unique model of your face that your Assistant uses to recognize you. This face model is stored on this Nest Hub Max and used to identify you when you’re in front of this device. It’s also temporarily processed at Google from time to time to improve the quality of your experience with this device.”
I asked Google to explain that last line to me.
“The images you provide are used to build your face model, which is stored on your device,” the spokesperson said. “However, we occasionally use the images you provide during setup to generate a face model in the cloud for a couple of reasons, all related to improving your product experience specifically on Nest Hub Max, and motivated by the fact that we have more computing power available in the cloud.”
So it uses the “Google Cloud,” yet somehow it doesn’t use it, at the same time.
Don’t bother asking “which one is the answer, yes or no?”, because the answer is that it is being used and you, the consumer, are being lied to. This is a logical deduction based on the plain wording of the statement and on the history of such companies. It is a more obvious and clearer lie than Bill Clinton’s claims of innocence of sexual misconduct during his presidency.
Why else would Google need to “temporarily process” images at its facility? What is involved in such “processing,” and where do such images and data go, and for what ends?
Now I cannot tell anybody “well it goes here,” because I do not objectively know. However, it is a proven fact that Amazon is earning tremendous amounts of money serving as a data and technology contractor with the CIA.
In the words of Mike Pompeo, the CIA is paid to lie, cheat, and steal.
Does one really trust Google to tell the truth when she has made herself a resting place within a den of thieves and gives herself over to them for money just as a whore does for clients?
Like the original, smaller-sized Nest Hub, which doesn’t include a camera at all, the Nest Hub Max features a kill switch behind the screen that disables the microphones. Now, that switch disables the camera, too.
That’s a step short of including a physical shutter that covers the camera entirely, a feature that consumers often appreciate with devices like these. Other Google Assistant smart displays, including the Lenovo Smart Display 10 and the JBL Link View made sure to include one. So did the Facebook Portal.
You can flip that switch in the back of the device to turn off the microphones and the camera. But, it won’t physically cover the camera lens like the shutters in other smart displays.
Many users prefer the sense of privacy offered by a shutter that they can leave closed when the device isn’t in use, especially if they plan to keep it somewhere like a bedroom. Amazon seemed to figure that out in between last year’s second-gen Amazon Echo Show smart display, which lacked a shutter, and this year’s Amazon Echo Show 5, which added one.
When asked about the lack of a shutter, Google defended its design by downplaying the distinction between kill switch and shutter altogether.
“We’ve included a mic + camera switch that electrically disables both the camera and mics, making it functionally equivalent to a physical camera shutter,” said a Google spokesperson. (source, source)
Of course, the camera has been intentionally manufactured so that it cannot be covered without extra effort, nor turned off by any obvious physical means.
Why else would anybody do this unless they had an interest in deceiving people so that they could be monitored without their meaningful consent?
But it is not just “big companies” who do this. The authorities are making use of the laws, in the name of “public security,” to force other companies to hand over data on people. Such was a recent case involving a gun-scope app for phones, in which the government is demanding that the company hand over all of its user data, regardless of who the person is, amounting to over ten thousand users:
Own a rifle? Got a scope to go with it? The U.S. government might soon know who you are, where you live and how to reach you.
That’s because the government wants Apple and Google to hand over names, phone numbers and other identifying data of at least 10,000 users of a single gun scope app, Forbes has discovered. It’s an unprecedented move: Never before has a case been disclosed in which American investigators demanded personal data of users of a single app from Apple and Google. And never has an order been made public where the feds have asked the Silicon Valley giants for info on so many thousands of people in one go.
According to an application for a court order filed by the Department of Justice (DOJ) on September 5, investigators want information on users of Obsidian 4, a tool used to control rifle scopes made by night-vision specialist American Technologies Network Corp. The app allows gun owners to get a live stream, take video and calibrate their gun scope from an Android or iPhone device. According to the Google Play page for Obsidian 4, it has more than 10,000 downloads. Apple doesn’t provide download numbers, so it’s unclear how many iPhone owners could be swept up in this latest government data grab.
If the court approves the demand, and Apple and Google decide to hand over the information, it could include data on thousands of people who have nothing to do with the crimes being investigated, privacy activists warned. Edin Omanovic, lead on Privacy International’s State Surveillance program, said it would set a dangerous precedent and scoop up “huge amounts of innocent people’s personal data.”
What is the justification being given? “Public safety,” “combating crime,” and “stopping terrorism,” of course. The government cites “illegal exports to Canada, the Netherlands, and Hong Kong,” and even HELPING THE TALIBAN, but no specifics were given as to who is being investigated, and it is said that the company itself is not under investigation. Along with this comes the personal user data as well as the locations where the app is being used, as pinpointed by IP address and geopositioning at the time:
The Immigration and Customs Enforcement (ICE) department is seeking information as part of a broad investigation into possible breaches of weapons export regulations. It’s looking into illegal exports of ATN’s scope, though the company itself isn’t under investigation, according to the order. As part of that, investigators are looking for a quick way to find out where the app is in use, as that will likely indicate where the hardware has been shipped. ICE has repeatedly intercepted illegal shipments of the scope, which is controlled under the International Traffic in Arms Regulation (ITAR), according to the government court filing. They included shipments to Canada, the Netherlands and Hong Kong where the necessary licenses hadn’t been obtained.
“This pattern of unlawful, attempted exports of this rifle scope in combination with the manner in which the ATN Obsidian 4 application is paired with this scope manufactured by Company A supports the conclusion that the information requested herein will assist the government in identifying networks engaged in the unlawful export of this rifle scope through identifying end users located in countries to which export of this item is restricted,” the government order reads. (The order was supposed to have been sealed, but Forbes obtained it before the document was hidden from public view.) There’s no clear stipulation on the government’s side to limit this to countries outside of America, though that limitation could be put in place.
It’s unclear just whom ICE is investigating. No public charges have been filed related to the company or resellers of its weapons tools. Reports online have claimed ATN scopes were being used by the Taliban.
If the court signs off on the order, Apple and Google will be told to hand over not just the names of anyone who downloaded the scope app from August 1, 2017 to the current date, but their telephone numbers and IP addresses too, which could be used to determine the location of the user. The government also wants to know when users were operating the app. (source, source)
Back in 1979, after a series of continual attacks in Central Asia by “Islamic terrorists,” the Russians decided to remedy the problem by moving into the area where the terrorists were based, the nation of Afghanistan. These “attacks” were instigated by the US government, which was backing said terrorists. When the Russians moved in, the US proceeded over the next decade to send tens of billions of dollars of advanced weaponry, bombs, and guns, and to provide logistical and military training to the Islamic militants there. This was the formation of the Afghan mujahideen, later known as the “Taliban,” as part of the well-documented CIA program under Gladio entitled Operation Cyclone.
Why such an emphasis on a small company? The simple answer is: to take personal data, to track people, and to compile lists for an unspecified future use, likely aimed at seizing people’s personal arms in the name of “national security” or some other regulation, and at finding where said weapons may be. At the same time, the same individuals doing this are happy to train terrorists and dump heavy weapons upon them, so long as they believe it serves their particular ends.
But take the government out of this for a minute. Consider strictly private corporations with private motives (although federal oversight will likely be forced in any such case involving money), where money is being integrated with one’s face and touted as the way of the future. Such is what some are saying: payment will be made through facial-recognition biometrics:
Biometric mobile wallets — payment technologies using our faces, fingerprints or retinas — already exist. Notable technology companies including Apple and Amazon await a day when a critical mass of consumers is sufficiently comfortable walking into a store and paying for goods without a card or device, according to Sinnreich, author of “The Essential Guide to Intellectual Property.”
Removing the last physical barrier — smartphones, watches, smart glasses and credit cards — between our bodies and corporate America is the final frontier in mobile payments. “The deeper the tie between the human body and the financial networks, the fewer intimate spaces will be left unconnected to those networks,” Sinnreich said.
People speak of the “mark of the beast,” but how much closer does one have to get? The future is being touted as one where “payment” for anything will be the selling of one’s own personal body through a machine, with one’s own bodily “data” now considered government or corporate property.
Aside from the moral problems this raises, there is also the issue of what happens if such data is stolen, or, as always happens in the corporate world, “mishandled” or sold to people who use it to scam.
Can a person “unmake” his face? Yet what happens if his “facial imprint” is stolen? How does he ever recover his data?
Consumer advocates are also worried about biometrics being used for commercial purposes. Three states — Washington, Texas and Illinois — have enacted statutes governing biometric information privacy. “The current lack of regulation is surprising given that biometric information is permanent and unique to each individual and, thus, creates a concern for identity theft,” Zimmerman said. Other states have proposed bills for such laws. (source, source)
Yes, it is about permanent theft, for it is impossible to “undo” one’s face, let alone other “biological imprints,” once they have been “stolen” and linked to sensitive financial information.
As one can see, these trends, though they appear “separate,” are not, because it is the combination of new “technologies” introduced into society that is resulting in the creation of a China-like network of social credit.
All one would need to do is “pull” data from these sources and place it into a centralized “system,” where a metric, never revealed to the public, could be assigned to each person and used to calculate his “score.” Of course, it would be wholly arbitrary and subject to political manipulation. Those in power would make certain that “enemies” always have low scores and that “good people,” meaning those with power, regardless of how evil they are, always have “high” scores.
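Mechanically, such a system would be trivial to build: pull per-person records from each data source, weight them by an arbitrary and adjustable scheme, and emit a single opaque number. The following sketch makes the point; every data source, weight, and record here is entirely hypothetical, invented only to show how little machinery the scheme requires:

```python
# Hypothetical weights: whoever controls this table controls the "score".
WEIGHTS = {
    "purchases_flagged": -15,     # e.g. gun-scope apps, ammunition
    "associations_flagged": -10,  # contacts deemed "bad"
    "biometric_enrolled": 5,      # rewarded for participating
    "compliance_events": 2,       # rewarded for "good" behavior
}

BASE_SCORE = 100

def social_credit_score(person_record):
    # Aggregate per-person counts pulled from centralized data sources
    # into a single opaque metric, never shown to the person being scored.
    score = BASE_SCORE
    for factor, weight in WEIGHTS.items():
        score += weight * person_record.get(factor, 0)
    return score

# Two hypothetical citizens: one "good", one "bad", per the arbitrary weights.
print(social_credit_score({"biometric_enrolled": 1, "compliance_events": 10}))
print(social_credit_score({"purchases_flagged": 3, "associations_flagged": 2}))
```

Note that nothing in the score itself records why it rose or fell; changing the weight table silently reclassifies everyone, which is precisely what makes such a metric so open to political manipulation.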
The result will be the same as in China, but whereas in China the restrictions are “strict” and imposed from without, in the American case they will likely take the form of isolating “bad” people on the economic and social periphery while the “good” people are concentrated in urban areas with more opportunities (save for the “bad” neighborhoods), creating two nations within one and bringing about the world that Huxley saw coming into formation.