VulnCon, Big Brother, School Daze, and More - Jan 22nd-28th, 2024 - F5 SIRT - This Week in Security

Editor's introduction 

Has it been seven weeks already?  Time flies like an arrow.  And fruit flies like a banana.  Yes, that sense of humor means MegaZone is once again at the controls. 

Before I get to the usual, a sidebar - if you're a CNA, check your email.  Voting is now open for the CVE Board CNA Liaison position.  While I am one of the nominees for the position, I'm not asking you to vote for me, just to vote - this is an important role, providing a voice for CNAs to the CVE Board.  Voting closes February 8th - look for details in the email from the CVE Program Secretariat.

With that out of the way, let's take a quick look at what caught my eye in this week's security news feed and prompted some thoughts of my own.

By the way, if this is your first TWIS, you can always read past editions.  And there is a lot of other content from the F5 SIRT to check out as well.

VulnCon 2024

I'm going to use my soapbox to plug CVE/FIRST VulnCon 2024, March 25th-27th, 2024 in Raleigh, North Carolina.  This is the inaugural VulnCon, and it is a joint effort of the CVE Program and FIRST.  It is also replacing the previously held annual CNA Summit, so if you're a CNA you're strongly encouraged to attend this instead.  I'll lift language from the site:

The purpose of the conference is to collaborate with various vulnerability management and cybersecurity professionals to develop forward leaning ideas that can be taken back to individual programs for action to benefit the vulnerability management ecosystem. A key goal of the conference is to understand what important stakeholders and programs are doing within the vulnerability management ecosystem and best determine how to benefit the ecosystem broadly.

Con registration and hotel information are available now, and the con is hybrid, so you can attend in person or virtually.  I'm also one of the organizing co-chairs, representing the CNA community for CVE.org, so there's that as well.

Depending on when this issue of TWIS hits DevCentral, the Call For Papers will either be just about to close or have just closed, as that happens on January 31st.  We've already started reviewing submissions and we'll be working on the program schedule in the coming weeks.  Stay tuned!

NSA Circumvents Warrants with Data Brokers

I'm not a fan of information-hoovering, privacy-invading data brokers.  I'm also not a fan of government overreach and deliberate efforts to circumvent citizens' rights.  So I was really not thrilled to learn that the NSA, and other US agencies, have been circumventing the need for warrants or court orders by turning to commercial data brokers to purchase location data, browsing habits, and other Internet tracking information, rather than collecting it themselves through legal channels.  Not only have they been circumventing the legal protections that shield citizens from abuse by such agencies, they've also been propping up the abusive data broker industry by funding it with tax dollars.  Lose-lose-lose for individual citizens.

The good news is that, in light of a recent FTC ruling, it looks like these practices may be ending now that they've come to light.

Failing Grades

CISA published a blog calling for university computer science programs to include security in their curricula, and I strongly agree.  Throughout my career, my experience has been that developers, even experienced developers, often have not had sufficient training in secure development practices.  This is especially true for new developers who come out of CompSci programs with the skills to build functional software, but little to no knowledge of real-world security concerns and how they should influence software design choices.  We, as an industry, keep reinventing the square wheel with the same types of vulnerabilities year after year.  Universities could do a lot of good by stressing secure software development in their programs, to better prepare the next generation of developers.  Security is not a specialization - security is for everyone, all software, all products, and all developers.  It should not be an elective - it should be a requirement.
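
To make that concrete, here is a minimal sketch of the kind of lesson I mean, using Python's built-in sqlite3 module and a hypothetical user lookup (the table and function names are purely illustrative): the perennial injection mistake next to the parameterized query that avoids it.

    import sqlite3

    # Illustrative only: a hypothetical user lookup, not code from any real product.
    def find_user_unsafe(conn: sqlite3.Connection, username: str):
        # The perennial mistake: building SQL by string concatenation.
        # Input like "x' OR '1'='1" changes the meaning of the query entirely.
        query = "SELECT id, username FROM users WHERE username = '" + username + "'"
        return conn.execute(query).fetchall()

    def find_user_safe(conn: sqlite3.Connection, username: str):
        # The habit worth teaching: parameterized queries keep data out of the SQL grammar.
        return conn.execute(
            "SELECT id, username FROM users WHERE username = ?", (username,)
        ).fetchall()

Nothing about the fix is hard - the gap is that many developers are never shown it until after they've shipped the vulnerable version.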

Do Not Fear CVE

Ivanti and Juniper Networks are catching a bit of heat from the security community over the way they've handled some recent issues and the CVE assignments, or lack thereof.  As someone who handles vulnerability management and disclosure, is heavily involved in the CVE program, and has even written a couple of articles on DevCentral on the topic, I had some thoughts on this.  I feel like some of this comes from a fear of CVE, or the desire to minimize the number of CVEs that you publish as a vendor.  The whole concept of 'more CVEs are bad', to me, is just the wrong way of thinking.  If you think that way it can, consciously or not, lead you to not disclose issues that you should.  You may start to justify, even to yourself, why something should not be a CVE.  Or you may stretch the definition of a 'fix' to stuff multiple issues into one CVE rather than issuing multiple CVEs.

As my colleagues would attest, my approach is more "Justify to me why this is not a CVE."  When in doubt, I believe it is better to issue a CVE.  A CVE is just a unique identifier for a specific software issue.  It isn't a scarlet letter or badge of shame.  I don't enjoy publishing CVEs, but I view them as necessary and correct for open communication.  The correct, and only, way to publish fewer CVEs is to have fewer vulnerabilities to disclose, IMHO.  When I see large vendors with complex products and very few CVEs, frankly, I'm skeptical.  Is it more likely that they've cracked the problem of writing vulnerability-free complex software, or that they're just not issuing CVEs and disclosing the issues they've found?  (Or, worse, not finding the issues?)
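
To underline that a CVE ID is just a key into a public record, here's a rough sketch of pulling one back programmatically.  It assumes the third-party requests library, and the endpoint and field names reflect my understanding of the public CVE.org record API and the JSON 5.x record format - check the current CVE Program documentation before relying on either.

    import requests  # third-party library

    def fetch_cve(cve_id: str) -> dict:
        # Unauthenticated record lookup against CVE.org's services API
        # (endpoint as I understand it; verify against current CVE Program docs).
        resp = requests.get(f"https://cveawg.mitre.org/api/cve/{cve_id}", timeout=10)
        resp.raise_for_status()
        return resp.json()

    # Log4Shell, used purely as a well-known example ID.
    record = fetch_cve("CVE-2021-44228")
    meta = record.get("cveMetadata", {})
    print(meta.get("cveId"), "assigned by", meta.get("assignerShortName"))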

Anyway, I encourage vendors and CNAs to not fear CVEs and to issue them when necessary.  And I'll continue to work to have F5 do the same.

What Will Kill Ransomware?

How many times have we covered ransomware in TWIS?  I feel like I've written something about it nearly every time I've been at the controls, and I've seen my colleagues cover it as well.  Every week I see articles on the latest ransomware attacks.  Ransomware is a regular subject in customer tickets.  We have regular corporate trainings and admonishments to be wary of it.  Ransomware is so pervasive that it feels just like a fact of life at this point.  But why?

Because it pays.  The only reason ransomware continues to persist, year in, year out, is that the organizations and individuals behind the campaigns continue to make money with it.  And as long as the financial incentive exists, so will ransomware.  I agree with the view that the only way to end, or at least reduce the prevalence of, ransomware is to starve the financial reward pipeline and remove the incentive.  Consider the figures from this Dutch study:

Among 430 victims from 2019-2022, 28% reported paying a ransom, with the average amount just over €431,000 (about $469,781) and the median €35,000 (about $38,138).

Companies with insurance paid significantly higher ransoms on average: €708,105 (about $771,600) compared to €133,016 (about $144,940).
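
A quick back-of-envelope calculation on those figures shows the scale of the incentive.  This is only a sketch - it assumes the reported average applies uniformly to the paying victims in that sample:

    # Back-of-envelope sketch using the Dutch study figures quoted above.
    victims = 430             # victims in the 2019-2022 sample
    paid_fraction = 0.28      # share that reported paying a ransom
    avg_ransom_eur = 431_000  # reported average payment

    paying_victims = victims * paid_fraction             # ~120 victims
    implied_total_eur = paying_victims * avg_ransom_eur  # ~52 million euros
    print(f"~{paying_victims:.0f} payers, roughly EUR {implied_total_eur:,.0f} in ransoms")

Roughly €52 million from one country's sample alone, and the gap between the €35,000 median and the €431,000 average suggests a handful of very large payouts doing much of the lifting.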

Those kinds of payouts are a significant incentive.  And having insurance just makes it worse.  If I were of an evil mindset, I'd certainly be more apt to target corporations I knew had cyber insurance.  And the problem is only getting worse:

Almost 5,200 organizations were hit by ransomware attacks in 2023, according to Rapid7. NCC Group concluded ransomware attacks increased 84% to almost 4,700 incidents in 2023.

Maybe it is time for some tough love, and a regulatory ban on ransomware payments.  Ransomware payments are directly funding cybercrime, and the funds are often believed to be funneled to other criminal, and even terrorist, activities.  A ban on paying ransoms would also apply to cyber insurance.  While the insurance could still cover losses from the impact of the ransomware, paying the ransom would become illegal.  Such a ban might force companies to redouble their efforts to better protect their networks and data against ransomware.  Better backup systems, antimalware, network partitioning, restricted permissions, etc.  Many security best practices are not employed because of the 'costs' of doing so.  But if the potential costs of not doing so increase, those implementation costs start to look more acceptable.

What is clear is that 'more of the same' just isn't working.  Ransomware is not only not going away, it is a growing problem.  What we're doing today is just making the problem worse.  It may be time to stop feeding the beast and to start starving it instead.

 

Until next time!
