Notes from the 2019 PCI Community Meeting

By Tony Fulda on September 25, 2019 (Last Updated on June 25, 2021)


The Online Team and I had a great time at the PCI Community Meeting last week, set in the spectacular environs of Vancouver, BC. We ate and drank, pontificated, watched ferries and seaplanes come into the harbor (my inner 8-year-old couldn’t resist, and I booked a flight out on one), and had a generally wonderful time networking with old and new friends in the payment security space. While there were far too many interesting presentations and conversations to put into one place, I had a few takeaways that I felt were worth sharing. In no particular order:


Kicking things off:

This year marked the 13th(!) anniversary of the PCI North American Community Meeting. A lot has changed (mostly for the better) over the years, and the DSS keeps evolving in response to new technologies, interpretations of the Standard, and the threat landscape. Lance Johnson came right out of the gates to talk about the SSC’s current mission and mandate, which is to:

  • Increase Industry Participation and Knowledge
  • Evolve Security Standards and Validation
  • Secure Emerging Payment Channels
  • Increase Standards Alignment and Consistency

I was struck this year by the SSC’s focus on the last point (integration of the multiple standards); there has obviously been a very thoughtful and concerted effort to unify the frameworks (PCI, PA-DSS, P2PE, EMV, etc.) to provide a continuum of security. Compared to some of the previous years’ meetings, the overall approach and tone seemed much more collaborative and less “top-down”, with the SSC very consciously acknowledging that the Assessors and Stakeholders on the front lines are the ones driving payment security (and the SSC’s role being more as a facilitator than the “bad cop”).

Sneak Peek of PCI DSS 4.0

Later in the day we got to the main attraction: Emma Sutcliffe provided a sneak peek of PCI DSS v4.0, and it’s definitely going to change the way some organizations approach their PCI compliance. While it didn’t appear that there are going to be any earth-shattering changes to the overall requirements, there was mention that every requirement will be updated; this may include changes to the actual requirement numbers. The overall structure of the DSS will be retained; however, the requirements are going to be organized into security objectives and refocused as outcome-based statements. Each requirement will have a clear identification of intent and expanded guidance. Some other likely changes include:

  • Improvements and clarifications on scoping
  • Further requirements for “Business as Usual” to ensure that security is a continuous process
  • Increased training requirements (including anti-phishing?)
  • Changes to password requirements based on emerging standards and industry guidance
  • Updates intended to make sure all payment technologies (e.g., cloud and mobile deployments) are included in the language and scope

The most interesting potential change is to the reporting itself (at least for the Report on Compliance), which will provide some options in how an organization reports status for each DSS control. The Council showed us a preview of the new proposed reporting template, which would allow for “mature organizations” to meet a requirement through a more flexible “Customized Implementation” approach. Each control will have a checkbox indicating whether the control is being met by the “Defined” approach (basically the 3.2.1-type requirement and testing procedure) or a “Customized Implementation” (which is more, dare I say, risk-based and focused on the intent of the requirement). The “Customized” approach would allow for the assessed organization to implement a control that meets the intent of the requirement, and the QSA to test and document the control in a more flexible manner. When using the “Customized” approach, the organization would:

  • Provide documentation describing the customized implementation
  • Define the who/what/where/when/how of the control
  • Provide evidence the control meets the stated intent of the DSS, as well as how the control is maintained

The QSA would:

  • Review the organization’s evidence
  • Develop an agreed-upon test procedure and then document the results demonstrating that the control has been met

If this sounds like the Council is just incorporating compensating controls into the body of the DSS, you’re kind of right, but not quite: it appears that this approach is intended to streamline reporting, get rid of the stigma of using a compensating control, and allow for more customized reporting and testing procedures without the need to define a technical or business constraint. However, given all of the extra work of defining/generating evidence, creating testing procedures, and justifying the intent and risk of each control, I foresee most organizations sticking with the tried-and-true “Defined” approach in most cases. It should also be noted that this is not an either-or: a combination of “Defined” and “Customized” controls could be used in a single assessment.

I look forward to seeing how this will be implemented in the new templates, and I can already predict many questions coming up around this approach (who will ultimately determine risk, who is on the hook in the event of a breach if a “Customized Control” fails, how maintaining and auditing a customized control should be done when no time-based criteria are defined, whether the Card Brands and SSC will add capacity to answer potentially more subjective questions related to intent and testing, etc.). Overall, I feel like the changes make sense, and this seems like a natural progression towards making the DSS more risk-based and less controls-based (though, no surprise, the Council still wouldn’t explicitly say that this new methodology represents a “risk-based” approach).

In line with my comment earlier about the SSC’s more collaborative approach this year: they indicated that the 4.0 templates are still very much a work in progress and they will be actively soliciting feedback through the RFC process throughout October and November 2019, with much more collaboration and transparency from the QSAs, Participating Organizations, and ASV communities. Stay tuned, this will be interesting…

Misc. Meeting Observations:

Gill Woodcock gave a really cool presentation about the PFI (PCI Forensic Investigator) work that the SSC has been conducting since 2015. I somehow didn’t know that this was an ongoing project, and the results were illuminating. The SSC has been collecting redacted PFI reports from all cardholder data breaches going back to the beginning of the PFI program and tracking the trends and commonalities, with the goal of determining where efforts should be focused and how the SSC’s standards can be improved. The main takeaways I got out of this:

  • By a considerable margin, most of the breaches that have occurred to this point are the result of control failures in Requirements 6 and 8 (patching, secure development, and access controls). After all this time, unpatched systems, poor application hygiene, and default/shared passwords continue to lead to a discouraging number of data breaches
  • Gill also showed the failing controls that were not the primary cause but a “contributing factor” in compounding the impact of the breach – these were primarily related to Requirements 10 and 11 (monitoring/alerting, and vulnerability testing)

Based on this, you should not treat all controls as equally important; pay (much) closer attention to your patching, secure development, and testing if you don’t want your organization’s redacted report being combed through by the SSC’s forensics team.  And remember, ALL controls must be in place for you to be PCI compliant!

From the “It’s as bad as you think…” department:

Chris Novak and Joshua Costa presented the findings from the 2019 Verizon Data Breach Investigations Report (always a great combination of enlightening and terrifying). Good news: point-of-sale attacks and skimming/shimming are down somewhat. Bad news: phishing and email-based malware attacks are still alive and well (though click rates on phishing emails went down a bit). They also showed a real-world attack that was decidedly low-tech: a bad actor set up a fake domain/email to man-in-the-middle a company’s Accounts Payable department, making off with hundreds of thousands of dollars with just a few days’ “work”. Takeaways:

  • Trust but verify. Put multiple checks into your processes, especially when a lot of money is controlled through a single point of failure (see the sketch after this list for one simple automated check)
  • As technology gets better (and environments get harder to breach), bad actors are going to continue to go after soft targets (like trusting end users). Awareness and phishing training are more important than ever
  • Note that a presentation from the Visa security conference showed that many breaches came from two-year-old vulnerabilities
  • Employ defense-in-depth strategies. When Chris and Joshua showed the number of steps involved in a successful attack, the highest success rate was in the “5 steps or less” category (by a lot). The harder a bad actor has to work, the less likely they are to bother; take a “kill chain” approach to security, and put multiple detective and preventative controls around critical systems and processes.
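On that “trust but verify” point, one cheap extra layer is to automatically flag inbound payment or banking-change emails whose sender domain merely resembles a trusted vendor’s domain. The following is a minimal, illustrative Python sketch of that idea (the vendor list, threshold, and domain names are hypothetical, not anything presented at the meeting), using only the standard library:

```python
import difflib

# Hypothetical list of domains your Accounts Payable team actually does business with.
KNOWN_VENDOR_DOMAINS = {"acme-supplies.com", "example-vendor.com"}

def lookalike_score(sender_domain: str) -> float:
    """Return the highest similarity between the sender's domain and any known vendor domain."""
    return max(
        difflib.SequenceMatcher(None, sender_domain.lower(), known).ratio()
        for known in KNOWN_VENDOR_DOMAINS
    )

def should_flag(sender_domain: str, threshold: float = 0.85) -> bool:
    """Flag domains that are suspiciously similar to, but not exactly, a trusted vendor domain."""
    if sender_domain.lower() in KNOWN_VENDOR_DOMAINS:
        return False  # exact match; this check won't catch a genuinely compromised vendor mailbox
    return lookalike_score(sender_domain) >= threshold

# "acme-suppiies.com" (note the swapped letters) gets flagged for manual review.
print(should_flag("acme-suppiies.com"))      # True
print(should_flag("totally-unrelated.org"))  # False
```

It’s a crude heuristic, and things like out-of-band callbacks for banking changes and email authentication matter more, but it illustrates the “multiple checks around a single point of failure” idea without much effort.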

Elementary, my Dear

Finally, Online’s very own Rob Harvey, Adam Kehler, and Boyd Clewis wrapped up the festivities with an extremely entertaining presentation about the difficulties of tracking user actions when an organization is highly dependent on using Root/Admin/sudo for system administration. Multiple commands were run using sudo, and then Boyd put on his Sherlock Holmes hat (for real 😊) as the team did a live log review demonstration showing how difficult it can be for an organization to attribute an action to an individual if some other form of authentication and correlation is not in place. And even though there were only two test users in the demo, “Bill” and “Ted”, our team showed how quickly it becomes impractical to track actions or discover bad actors when a large number of users are running in a privileged or root context.

When I multiplied the results in my head across a large organization, it really brought home the risks of using shared and root accounts, and demonstrated how important well-tuned event management and secure user access are in protecting the PCI environment. Their demonstration tied in really well with Gill’s presentation mentioned earlier (i.e. shared and generic accounts being a primary cause of data breaches), and showed why using shared and generic accounts without appropriate correlation and user management controls should be keeping organizations up at night.
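To make that attribution problem a bit more concrete: when commands are run through sudo, syslog typically records the invoking account alongside the command, so even a simple correlation script can tell you whether “Bill” or “Ted” ran something as root. Below is a rough Python sketch of that correlation (the log path and line format are assumptions based on a typical Linux auth.log, not the team’s actual demo tooling):

```python
import re
from collections import Counter, defaultdict

# Typical syslog format for a sudo invocation, e.g.:
#   Sep 25 10:15:01 host1 sudo:     ted : TTY=pts/0 ; PWD=/home/ted ; USER=root ; COMMAND=/usr/bin/vim /etc/hosts
SUDO_LINE = re.compile(r"sudo:\s+(?P<user>\S+)\s+:\s+.*COMMAND=(?P<cmd>.+)$")

def attribute_sudo_commands(log_path: str = "/var/log/auth.log"):
    """Map each privileged command back to the account that invoked it via sudo."""
    by_user = defaultdict(Counter)
    with open(log_path, errors="replace") as log:
        for line in log:
            match = SUDO_LINE.search(line)
            if match:
                by_user[match.group("user")][match.group("cmd").strip()] += 1
    return by_user

if __name__ == "__main__":
    # Print each invoking account and its five most frequent privileged commands.
    for user, commands in attribute_sudo_commands().items():
        print(user)
        for cmd, count in commands.most_common(5):
            print(f"  {count:>4}  {cmd}")
```

The flip side is exactly the gap the demo highlighted: anything executed from a direct root login or a shared generic session carries no invoking identity at all, so this kind of correlation simply isn’t possible without additional authentication and log management controls.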

Interesting stuff, and it was great to see everyone. Remember to stay safe, patch your systems, change your passwords, and buckle up for 4.0!

 
