UK: IoT security-by-design report and draft Code of Practice (devices, IoT, mobile apps)

“The report and draft Code of Practice advocates a fundamental shift in approach to moving the burden away from consumers having to secure their devices and instead ensure strong cyber security is built into consumer IoT products by design.

The draft Code of Practice for industry contains 13 practical steps to improve the cyber security of consumer IoT.”

The draft CoP is aimed at Device Manufacturers, IoT Service Providers, and Mobile Application Developers (!)

https://www.enisa.europa.eu/news/member-states/uk-government-published-security-by-design-report


Black-box extraction of secrets from deep learning models

Fascinating paper: “The Secret Sharer: Measuring Unintended Neural Network Memorization & Extracting Secrets”, Nicholas Carlini, Chang Liu, Jernej Kos, Úlfar Erlingsson, Dawn Song at https://arxiv.org/abs/1802.08232

Turns out that your model memorizes the secrets in your training data, even when the model is a lot smaller than the training data itself. My jaw fell to the floor right here:

“The fact that models completely memorize secrets in the training data is completely unexpected: our language model is only 600KB when compressed, and the PTB dataset is 1.7MB when compressed. Assuming that the PTB dataset can not be compressed significantly more than this, it is therefore information-theoretically impossible for the model to have memorized all training data—it simply does not have enough capacity with only 600KB of weights. Despite this, when we repeat our experiment and train this language model multiple times, the inserted secret is the most likely 80% of the time (and in the remaining times the secret is always within the top 10 most likely). At present we are unable to fully explain the reason this occurs. We conjecture that the model learns a lossy compression of the training data on which it is forced to learn and generalize. But since secrets are random, incompressible parts of the training data, no such force prevents the model from simply memorizing their exact details.”

https://arxiv.org/pdf/1802.08232.pdf
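
For a concrete feel for what the paper measures, here is a minimal sketch of its "exposure" metric. A toy character-bigram model stands in for the paper's LSTM, and the corpus, the canary format, and all function names are illustrative assumptions rather than the paper's actual setup; only the exposure formula (log2 of the candidate-space size minus log2 of the canary's rank) follows the paper:

import math
import random
import string
from collections import Counter, defaultdict

# Toy character-bigram language model, standing in for the paper's LSTM.
def train_bigram(corpus):
    counts = defaultdict(Counter)
    for text in corpus:
        for a, b in zip(text, text[1:]):
            counts[a][b] += 1
    return counts

def neg_log_likelihood(counts, text, alpha=0.01, vocab=64):
    # Add-alpha smoothed average negative log-likelihood; lower = more likely.
    nll = 0.0
    for a, b in zip(text, text[1:]):
        c = counts.get(a, Counter())
        nll -= math.log((c[b] + alpha) / (sum(c.values()) + alpha * vocab))
    return nll / max(len(text) - 1, 1)

# Insert a random "secret" canary a few times into ordinary training data.
secret = "".join(random.choices(string.digits, k=9))
canary = f"my ssn is {secret}"
corpus = ["the quick brown fox jumps over the lazy dog"] * 200 + [canary] * 4
model = train_bigram(corpus)

# Exposure (per the paper): log2(|candidates|) - log2(rank of the canary)
# when ranked by likelihood against random candidates of the same format.
R = 10_000
candidates = [f"my ssn is {''.join(random.choices(string.digits, k=9))}"
              for _ in range(R)]
target = neg_log_likelihood(model, canary)
rank = 1 + sum(neg_log_likelihood(model, c) < target for c in candidates)
print(f"canary rank {rank} of {R}; exposure ~ {math.log2(R / rank):.1f} bits")

Even this crude model drives the canary to rank 1 on almost every run: the random digits occur nowhere else in the corpus, so nothing forces the model to generalize over them, and it simply memorizes them, which is exactly the effect the quote describes.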

ENISA: Handbook on Security of Personal Data Processing

“The overall scope of the report is to provide practical demonstrations and interpretation of the methodological steps of the ENISA’s 2016 guidelines for SMEs on the security of personal data processing. This is performed through specific use cases and pragmatic processing operations that are common for all SMEs.”

https://www.enisa.europa.eu/publications/handbook-on-security-of-personal-data-processing

[UK] ICO’s Liz Denham on direct marketing consent

https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2018/02/dma-data-protection-2018/

The detail of the e-privacy regulation is still being debated, but the current draft makes opt-in the default for all consumer marketing.

Until the e-privacy regulation comes into force, PECR will sit alongside the GDPR.

That means electronic marketing will require consent. Yes, there is potential to use legitimate interests as a legal basis for processing in some circumstances, but you must be confident that you can rely on it.

It seems to me that a lot of energy and effort is being spent on trying to find a way to avoid consent. That energy and effort would be much better spent establishing informed, active, unambiguous consent.

You say you will lose customers. I say you will have better engagement with them and be better able to direct more targeted marketing to them. You will have complete confidence that your customers have given informed consent.
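
One practical footnote from the engineering side: if you do build on consent, the hard part is being able to evidence it later. Here is a minimal sketch of what a consent record might capture; the class and field names are my own illustration, not a schema mandated by the GDPR or the ICO:

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Evidence that consent was informed, active, and unambiguous."""
    subject_id: str      # your internal identifier for the customer
    purpose: str         # e.g. "email marketing for product updates"
    wording_shown: str   # the exact text the customer saw at the time
    mechanism: str       # e.g. "unticked opt-in checkbox", never pre-ticked
    obtained_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    withdrawn_at: Optional[datetime] = None  # withdrawing must be as easy as giving

    def is_active(self) -> bool:
        return self.withdrawn_at is None

# Example: record an active opt-in, then honour a later withdrawal.
record = ConsentRecord(
    subject_id="cust-8412",
    purpose="email marketing for product updates",
    wording_shown="Yes, email me product updates. Unsubscribe at any time.",
    mechanism="unticked opt-in checkbox",
)
record.withdrawn_at = datetime.now(timezone.utc)
assert not record.is_active()

Storing the exact wording and mechanism alongside the timestamp is what lets you demonstrate, later, that the consent you hold is the informed, active, unambiguous kind described above.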