
What could kill NSTIC? A Friendly Threat Assessment In Three Parts

  1. What could kill NSTIC? A Friendly Threat Assessment In Three Parts. January 2013. Phil Wolff, Strategy Director, Personal Data Ecosystem Consortium. phil@pde.cc, @evanwolf, LinkedIn. Download the whitepaper: http://pde.cc/nsticrisks

  2. High hopes for an ID ecosystem Can we get to an international digital identity system?

  3. High hopes for an ID ecosystem Can we get to an International, User-Centric digital identity system that works across Industries? Cultures? Technologies? Governments? Regulatory schemes?

  4. High hopes for an ID ecosystem This effort is driven in the United States by the National Strategy for Trusted Identities in Cyberspace (NSTIC), a 2011 strategy implemented through the National Institute of Standards and Technology (NIST) of the US Department of Commerce.

  5. Our findings, in short: The two most serious threats to NSTIC’s success: a user experience that doesn’t work, and an imbalance among the forces that hold an identity ecosystem together.

  6. A dozen of us met • to list and score threats to the NSTIC Identity Ecosystem vision. • Internet Identity Workshop, Mountain View, California, October 2012 (an earlier session was held in May 2011).

  7. We asked: If NSTIC fails by 2016, what could have brought it down?

  8. HERE ARE OUR HYPOTHETICAL FAILURE SCENARIOS: A 2016 POST-MORTEM OF NSTIC

  9. We spoke in the past tense, as if the failures had already happened.

  10. We didn’t cooperate to build an ID ecosystem. We should have played well with others.

  11. We didn’t cooperate to build an ID ecosystem. We should have played well with others. Took too long. Strung out by process problems. (Alternatives emerged.)

  12. We didn’t cooperate to build an ID ecosystem. We should have played well with others. Industry failed to build it. (Capital and management didn’t prioritize.)

  13. We didn’t cooperate to build an ID ecosystem. We should have played well with others. The NSTIC community became balkanized and lost cohesion; members didn’t listen to each other. (Little to no interop.)

  14. We didn’t cooperate to build an ID ecosystem. We should have played well with others. The program was co-opted by a Big Brother government. (Not trustworthy internationally and for many purposes.)

  15. Just because it’s built doesn’t mean they’ll use it.

  16. Just because it’s built doesn’t mean they’ll use it. Worked, but was not trusted. (Failed Brand).

  17. Just because it’s built doesn’t mean they’ll use it. Was subverted and insecure. (Legitimately Untrusted).

  18. Just because it’s built doesn’t mean they’ll use it. Enterprise didn’t adopt it. (Business case not well made.)

  19. Just because it’s built doesn’t mean they’ll use it. After one failure, supporters abandoned the project. “Burned once, twice shy.” (Shallow, brittle commitment; low tolerance for failure.)

  20. Just because it’s built doesn’t mean they’ll use it. The IE was an empty room. No critical mass formed. There was an imbalance of supply and demand. (Anchor tenants didn’t sign on. Institutions didn’t enroll millions of users or pull in industry ecosystems.)

  21. Just because it’s built doesn’t mean they’ll use it. Citizens didn’t want trusted identity. (Poor market fit; lack of perceived benefit over alternatives.)

  22. We didn’t build the right things the right way.

  23. We didn’t build the right things the right way. A local failure took down the whole identity ecosystem. (Failures of ecosystem trust, architecture, integration testing, and risk analysis.)

  24. We didn’t build the right things the right way. The IdP/RP/Trust identity model was inferior to newer models. (Technology risk.)

  25. We didn’t build the right things the right way. The IdP/RP/Trust identity model broke at scale or broke in diverse contexts. (Project design risk.)
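
  For readers who don’t know the pattern the two slides above refer to, here is a minimal, hypothetical sketch of the IdP/RP trust model: a relying party (RP) accepts a claim about a user only if it carries a valid, fresh assertion signed by an identity provider (IdP) it already trusts. The names, shared-secret signing scheme, and checks below are illustrative assumptions only, not anything specified by NSTIC or by the slides.

```python
# Minimal, illustrative IdP/RP trust-model sketch (hypothetical names and keys;
# not an NSTIC specification). The IdP signs an assertion about a user; the RP
# verifies the signature, audience, and freshness before trusting the identity.
import hmac, hashlib, json, time

SHARED_SECRET = b"example-idp-rp-secret"  # stand-in for a real key or trust-framework agreement

def idp_issue_assertion(user_id: str, audience: str) -> dict:
    """Identity provider: issue a signed assertion intended for one relying party."""
    payload = {"sub": user_id, "aud": audience, "iat": int(time.time())}
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def rp_verify_assertion(assertion: dict, expected_audience: str, max_age_s: int = 300) -> bool:
    """Relying party: accept the identity only if signature, audience, and age all check out."""
    body = json.dumps(assertion["payload"], sort_keys=True).encode()
    expected_sig = hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected_sig, assertion["sig"]):
        return False  # tampered with, or signed by a party the RP does not trust
    p = assertion["payload"]
    return p["aud"] == expected_audience and (time.time() - p["iat"]) <= max_age_s

# Usage: the RP never sees the user's credentials, only the IdP's assertion.
a = idp_issue_assertion("alice@example.org", audience="rp.example.com")
print(rp_verify_assertion(a, expected_audience="rp.example.com"))  # True
```

  Even this toy version hints at why slide 25’s concern matters: every IdP/RP pair needs agreed keys, formats, and policies, which is exactly what becomes hard at scale and across diverse contexts.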

  26. We didn’t build the right things the right way. Miscommunication within the Identity Ecosystem contributed to its death. (Poor cooperation, weak community, high self-interest, low trust.)

  27. Failed User Experience.

  28. Failed User Experience. UX was too hard.

  29. Failed User Experience. Everything went wrong that could go wrong.

  30. We Built-In Structural Instability.

  31. We Built-In Structural Instability. Along with user experience, structural instability was the big issue, according to the group…

  32. We Built-In Structural Instability. • The four pillars of the ecosystem (Technology, Economics, Policy, Culture) must all be strong. • Each relationship among them was imbalanced.

  33. We Built-In Structural Instability. Each of these pillars operated on a different tempo. • It was fast to iterate improved user experiences but slow to socialize each round among public policy and enterprise lawyers, for example.

  34. We Built-In Structural Instability. Motivations were misaligned. • Some companies, for example, engineered tariffs for data sharing into their terms of service, cutting off public sector and NPOs from their customers.

  35. We Built-In Structural Instability. Core ideas didn’t survive translation. • Several large Internet engineering companies backed out of supporting IE infrastructure because the “Easy ID” brand became a running joke on sitcoms, SNL, and a biting meme on YouTube.

  36. We Built-In Structural Instability. Liability was broken. • Tragic risks were taken with some technologies and contracts by pushing exposure from those who enabled risk to those who didn’t.

  37. This session was in October 2012. • But wait, there’s more…

  38. We did a similar exercise 18 months earlier, in May 2011, with a similar group. https://secure.flickr.com/photos/philwolff/5713880402/ cc-by Phil Wolff 2. EIGHTEEN MONTHS EARLIER...

  39. Key Risks (via 2011) :

  40. Key Risks (via 2011) : Lack of adoption.

  41. Key Risks (via 2011) : Impatience for long learning curve.

  42. Key Risks (via 2011) : Usability failures. (early concern)

  43. Key Risks (via 2011) : Interop failures.

  44. Key Risks (via 2011) : Overscope.

  45. Key Risks (via 2011) : Security problems like phishing and malware drawn by money.

  46. Key Risks (via 2011) : Perversion of principles.

  47. Key Risks (via 2011) : Chicken vs. Egg problems.

  48. Key Risks (via 2011) : Short Attention Span and the Hype Cycle

  49. Key Risks (via 2011) : Regulatory blocks: privacy laws, antitrust concerns, uncertainty about liability.

  50. Key Risks (via 2011) : Waiting for Winners

  51. Key Risks (via 2011) : Dystopian Fear

  52. Key Risks (via 2011) : Over-promising by tech communities to policy communities

  53. Key Risks (via 2011) : • Lack of adoption. • Impatience for long learning curve. • Usability failures. • Interop failures. • Overscope. • Security problems like phishing and malware drawn by money. • Perversion of principles. • Chicken vs. Egg problems. • Short Attention Span and the Hype Cycle. • Regulatory blocks including privacy laws, antitrust concerns, and uncertainty about liability. • Waiting for Winners. • Dystopian Fear. • Over-promising by tech to policy communities.

  54. We had time, in the 2011 session, to brainstorm what might avoid or mitigate these threats.

  55. Action: Small successes. Build confidence.

  56. Action: Industry marketing, PR, Media/Voice. Build public understanding.

  57. Action: Community user experience sharing (KM). Accelerate design.

  58. Action: Cultivate Engineering Focus. Developer relations.

  59. Action: Governance driving Interop Testing. Interop is a leadership challenge.

  60. Action: Clear/Graded Roadmap. Short-term plans, long-term visions.

  61. Action: Electronic Authentication Guideline (NIST SP 800-63) and other threat comment. Connect to existing NIST processes.

  62. Action: Security Council / Antiphishing Working Group. Make security an explicit IESG activity.

  63. Action: Government Affairs activity. Engage the US and other governments.

  64. Action: OIX Risk Wiki. Engage the OIX community.

  65. The fear of “failure to deliver” was still there. WHAT CHANGED BETWEEN THE TWO SESSIONS?

  66. What changed between the two sessions? 1. Outside forces like dystopian fear among users, security failures, and regulatory challenges were less prominent or not mentioned. 2. Drivers of failure shifted almost exclusively to internal ones.

  67. What changed between the two sessions? The primary concern: leadership. Once funding, staffing, and collaboration started, the identity ecosystem did not take charge and master the challenges as they emerged.

  68. 3. Last-minute update... (Image: Arbroath Cliffs Warning Notice, CC-BY-NC Alan Parkinson)

  69. Cuts are coming • The US federal government is cutting spending in 2013, as we prepare this paper in December 2012. • By cleaver, if a “fiscal cliff avoiding” budget is passed. • By chainsaw, if Congress and the President fall over the “cliff.”

  70. Direct effects. Nobody knows if this will directly affect NIST and the NIST staff managing the NSTIC project.
