Lessons Learned Applying ATT&CK-Based SOC Assessments
Andy Applebaum, @andyplayse4
SANS Security Operations Summit, June 24, 2019
MITRE

Challenges in the SOC
[Figure: bar chart of survey results, "Challenges, n=239", ranking SOC challenges by response count: lack of skilled staff; lack of automation and orchestration; too many tools that are not integrated; lack of processes or playbooks; lack of enterprise-wide visibility; lack of management support; too many alerts that we can't look into; silo mentality between security, IR and...; high staffing requirements; lack of context related to what we are seeing; regulatory or legal requirements; other]

Background: What is ATT&CK?
= ATT&CK™ is a globally-accessible knowledge base of adversary tactics and techniques, developed based on real-world observations of adversaries' operations
[Figure: the Pyramid of Pain (source: David Bianco), ranking indicator types from Trivial at the base up through Easy, Simple, Annoying, and Challenging to Tough! at the top; ATT&CK sits at the top of the pyramid, at the level of TTPs]

The ATT&CK Matrix
= Grounded in real data from cyber incidents
= Focuses on describing adversary TTPs, not IoCs
= Decouples the problem from the solution
= (also has information on groups and software)

Core ATT&CK Use Cases
= Detection
= Threat Intelligence
= Assessment and Engineering
= Example analytic pseudocode shown on the slide (reg.exe called from a command shell):

    processes = search Process:Create
    reg = filter processes where (exe == "reg.exe" and parent_exe == "cmd.exe")
    cmd = filter processes where (exe == "cmd.exe" and parent_exe != "explorer.exe")
    reg_and_cmd = join (reg, cmd) where (reg.ppid == cmd.pid and reg.hostname == cmd.hostname)
    output reg_and_cmd

Starting with ATT&CK: Understanding Detection Gaps
[Figure: annotated coverage map, e.g. "We have some confidence we would detect the Command and Control channel" and "We have high confidence we would detect Scheduled Transfer if executed"]
= Knowledge of my detection gaps allows me to...
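The detection pseudocode above could be implemented over process-creation telemetry in a few lines. This is a minimal sketch: the event schema (`exe`, `parent_exe`, `pid`, `ppid`, `hostname`) is a hypothetical flattening of whatever your sensor emits; only the filter-and-join logic mirrors the pseudocode.

```python
# Sketch of the slide's example analytic: flag reg.exe launched from a
# command shell by filtering process-creation events and joining child
# to parent. The event schema here is hypothetical.

def reg_from_cmd(events):
    """Return (cmd_event, reg_event) pairs where reg.exe was spawned by cmd.exe."""
    reg = [e for e in events if e["exe"] == "reg.exe" and e["parent_exe"] == "cmd.exe"]
    cmd = [e for e in events if e["exe"] == "cmd.exe" and e["parent_exe"] != "explorer.exe"]
    # Join: reg.exe's parent pid must be the cmd.exe pid, on the same host.
    return [(c, r) for r in reg for c in cmd
            if r["ppid"] == c["pid"] and r["hostname"] == c["hostname"]]

sample = [
    {"exe": "cmd.exe", "parent_exe": "powershell.exe", "pid": 100, "ppid": 50, "hostname": "ws1"},
    {"exe": "reg.exe", "parent_exe": "cmd.exe", "pid": 101, "ppid": 100, "hostname": "ws1"},
    {"exe": "cmd.exe", "parent_exe": "explorer.exe", "pid": 200, "ppid": 60, "hostname": "ws2"},
]

hits = reg_from_cmd(sample)
print(len(hits))  # 1: the non-interactive cmd.exe that spawned reg.exe
```

The `parent_exe != "explorer.exe"` filter keeps the analytic behavioral rather than signature-like: it focuses on shells not launched interactively from the desktop.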
This Talk: Getting Towards Detection Gaps
= Our experiences are from running ATT&CK-based SOC assessments
— Short, rapid-fire methodology to approximate detection gaps in a SOC
= Lessons learned from running these assessments, applicable to:
— Third-party or in-house assessments
— "Paper" assessments or hands-on ones
— General ATT&CK integration
= Why you should care
— ATT&CK can help solve some of the hard problems — but there are tips, tricks, and pitfalls in trying to use it to do so

Getting Started: Using ATT&CK for Assessments

Bringing ATT&CK into the SOC
Solution: ATT&CK-based SOC Assessments
[Diagram: a third-party assessment team, or an internal assessment team, evaluates the target SOC environment and produces a detection heatmap]

Enter: ATT&CK-based SOC Assessments
= Methodology to map the SOC's detection abilities to ATT&CK
— Paints broad strokes of detection capabilities
— Provides a rapid, first-look view into the SOC's current state
= Useful for SOCs wanting to integrate ATT&CK into day-to-day operations
= Process details: no hands-on systems needed
= Output: ATT&CK detection heatmap, prioritization plan, recommendations

Experiences with ATT&CK-Based SOC Assessments
= First run in 2017
= Since then: we've learned a lot along the way...
— Lessons for third-party assessors
— Lessons for in-house assessors
— General lessons on using ATT&CK
= What outcomes have we had?
— More structured analytic development programs
— General growth: tooling, data collection, processes

Conducting an ATT&CK-based SOC Assessment
1. Setting the Stage
Setting Expectations
= ATT&CK's popularity has led some to treat it as a silver bullet
— People often have skewed expectations of what performing an ATT&CK assessment provides
— Applies to assessments done by third parties, as well as those conducted in-house
= Bottom line: if you're bringing ATT&CK into a SOC, make sure you set the right expectations

Messaging ATT&CK-Based SOC Assessments
= The word "assessment" can sometimes have a negative connotation
— Assessments are often used as ways to gauge your skills/progress
— Cyber mentality is often that assessments are antagonistic: the assessor is painting a picture of fault for the assessee
= Risks of running an "assessment":
— Staff might not comply with the process => assessment requires more effort/time
— Staff might worry about how results will be used => nitpicked details, wordsmithed report
— Personnel might misrepresent/exaggerate current capabilities => yields inaccurate results
— Leadership may overreact to results => end up causing damage, not good

Tips for Staging ATT&CK-Based SOC Assessments
1. Consider using a phrase other than "assessment"
2. Make sure leadership understands the point of the assessment, and that the assessment aligns with their goals
3. Position the assessment as a stepping-stone to improvement, not as a way to gauge performance
4. Ensure SOC staff know they're not being evaluated — rather, the SOC's policies, procedures, tooling, etc. are
5. Prepare to follow up after running an assessment

Conducting an ATT&CK-based SOC Assessment
2. Getting Data: Tools, Documentation, and Interviews

Analyzing Tools + Documentation
1. Map each tool to the data sources it may detect, and the data sources to the techniques in ATT&CK
— Can be useful for approximating coverage when documentation is sparse
2. Analyze each analytic — will it detect a behavior, or is it a static signature? What techniques can it detect?
3. Look at documentation — find standard processes and procedures, mapping them to ATT&CK whenever possible
— Example: an account lockout policy can impact Brute Force
= For each component: create a coverage heatmap to track your work

Past Documentation: The Importance of Interviews
= Many SOCs are using tools that they haven't documented (yet)
= Some tools may be used differently in practice than in theory
= Most people configure tools and develop pipelines specific to their usage
= Tools can be modified with vendor modules/add-ons, or by the end user
= Documentation often lacks a direct ATT&CK mapping, which can be hard to infer
= Documentation can be ambiguous; interviews tend to provide more specifics

Tips for Conducting Interviews
1. Break questions down by team
2. Walk through examples: how would you detect lateral movement?
— If these questions go well, start scoping to tactics and techniques
— If they don't, try asking general questions
3. Ask each team what their favorite tool is, and why
— How often do they use it? What do they look for?
4. Come prepared — but be prepared to change your script

Conducting an ATT&CK-based SOC Assessment
3. Producing the Heatmap

Pick a Good Scoring Scheme For Your Heatmap
[Example legend: High Confidence of Detection / Some Confidence of Detection / Low Confidence of Detection / No Confidence of Detection / Static Detection Possible]
= Regardless of the approach, have a scoring scheme with categories that are relevant to your needs
— Different detection types (confidence levels, static vs. behavioral, etc.)
= Choose good colors and a good scale (gradient, discrete, ...)
= Settle on something that conveys the right information at the right layer
— Removing just one category has significant communication impacts
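The tool-to-data-source-to-technique mapping described above, plus a discrete scoring scheme, can be sketched together. The tool names and mappings below are illustrative stand-ins for what a documentation review would produce, and the output is a stripped-down approximation of an ATT&CK Navigator layer (a real layer file carries additional version metadata).

```python
import json

# Illustrative mappings: in practice these come from reviewing tool docs
# and interviews. The technique IDs are real ATT&CK IDs; the coverage is not.
TOOL_DATA_SOURCES = {
    "edr-agent": ["Process Creation", "File Monitoring"],
    "netflow-collector": ["Netflow"],
}
DATA_SOURCE_TECHNIQUES = {
    "Process Creation": ["T1053"],   # Scheduled Task
    "File Monitoring": ["T1105"],    # Remote File Copy
    "Netflow": ["T1105", "T1029"],   # Remote File Copy, Scheduled Transfer
}

# Approximate coverage: how many deployed tools could observe each technique.
coverage = {}
for tool, sources in TOOL_DATA_SOURCES.items():
    for source in sources:
        for tid in DATA_SOURCE_TECHNIQUES.get(source, []):
            coverage[tid] = coverage.get(tid, 0) + 1

# Discrete scores (0 = no confidence, 1 = low, 2 = some) in a minimal
# Navigator-style layer structure.
layer = {
    "name": "approximate coverage",
    "domain": "enterprise-attack",
    "techniques": [{"techniqueID": t, "score": min(n, 2)}
                   for t, n in sorted(coverage.items())],
}
print(json.dumps(layer, indent=2))
```

Counting tools per technique is a crude proxy for confidence, but it matches the spirit of a "paper" assessment: a fast approximation you refine later with interviews and analytic review.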
Heatmaps: Avoiding Red
= When people see red, they assume it's saying "Look here! This is a really big problem!!!"
= Use it only as needed, to call attention to specific areas that should be focused on
= A legend without red conveys the same message, but is easier to digest
= It positions the results as less antagonistic: these are areas of improvement, not failure
= Even outside of assessments — be cautious when using red

Being Realistic: Heatmaps are not Axiomatic
= Coverage heatmaps are great
— Easy to understand; tangible and straightforward
— Provide a "high level" picture; useful to all staff
= ...but:
1. Coverage doesn't always align with how attacks are executed in practice
— Techniques can be executed in many ways, with different detections for each
— Per-technique detection isn't always the right level of abstraction
2. Coverage is not static: what's green today could be gone tomorrow!
— Attacker TTPs and defender practices rotate; don't ignore what you cover today
3. Remember: ATT&CK heatmaps are almost always approximations
— If you're doing this as a third party, make sure the SOC knows this
— If you're doing this in-house, make sure colleagues and leadership understand

Complement Your Heatmap with Prose
1. If doing an assessment — don't just hand off a heatmap; describe it
2. Write up a short summary:
— What were some notable ATT&CK strengths?
— What were some notable ATT&CK gaps?
— Talk at the tactic level, but refer to relevant and important techniques
3. Don't stop at ATT&CK
— Summaries are great for the heatmap — but include information on general trends observed as well

Conducting an ATT&CK-based SOC Assessment
4. Delivering Results

Try to Focus Prioritization
[Example heatmap: existing logs can be used to detect Remote File Copy and Data From Removable Media, making analytic development easier; legend includes a "Prioritized Technique" category alongside the confidence-of-detection categories]

1. Small lists of techniques are great for short-term wins
2. Follow one of two paradigms:
— A technique or two across tactics, or
— Many techniques in one tactic
3. Focus on techniques that are immediately relevant
— Are they used by relevant threat actors?
— Are they popular or frequently occurring?
— Are they easy to execute, and do they enable more techniques?
— Are the necessary logs readily accessible?

Give Tangible Recommendations
= It's easy to give recommendations!
= ...but it's hard to give targeted ones (and those are the most helpful!)
= Consider giving:
1. Short- and long-term recommendations
2. Examples and starting points
— Techniques to focus on for analytics
— Threat groups to emulate for adversary emulation
— Reading material to help get started
3. Prioritized recommendations for triage
— Examples: adding analytics; validating your coverage with offensive testing

Sample Recommendation: Adding Analytics
1. Start with an initial assessment
2. Focus on high-priority techniques
— Remote File Copy
— Windows Admin Shares
— Valid Accounts
3. Update the coverage map
— Remote File Copy: Low to High
— Windows Admin Shares: Low to Some
— Valid Accounts: Low to Some

Summary: Addressing Hard Challenges

Revisiting the Hard Problems
[Figure: the "Challenges, n=239" survey chart again, highlighting the challenges that knowledge of detection gaps can address: too many tools that are not integrated; lack of management support; lack of enterprise-wide visibility; too many alerts that we can't look into; silo mentality between security, IR and...; lack of context related to what we are seeing]
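The prioritization criteria above can be sketched as a simple weighted score. The weights and per-technique flags are illustrative, not part of the methodology; the point is only that "immediately relevant" can be made concrete enough to rank candidates.

```python
# Hypothetical weighting of the slide's prioritization criteria.
WEIGHTS = {
    "used_by_relevant_actors": 3,
    "frequently_occurring": 2,
    "logs_readily_accessible": 2,
    "enables_more_techniques": 1,
}

def priority(technique):
    """Sum the weights of the criteria a candidate technique satisfies."""
    return sum(w for criterion, w in WEIGHTS.items() if technique.get(criterion))

candidates = [
    {"id": "T1105", "used_by_relevant_actors": True,
     "frequently_occurring": True, "logs_readily_accessible": True},
    {"id": "T1029", "frequently_occurring": True},
]
ranked = sorted(candidates, key=priority, reverse=True)
print([t["id"] for t in ranked])  # T1105 ranks first
```

A scheme like this also documents *why* a technique made the short list, which helps when positioning the results as a stepping-stone rather than a grade.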
With Knowledge of My Detection Gaps, I Can...
= Too many alerts that we can't look into => prioritize alerts based on ATT&CK mapping
= Too many tools that are not integrated => map tools to ATT&CK to see overlaps
= Lack of management support => use heatmaps to show progress
= Lack of enterprise-wide visibility => use heatmaps to see current coverage
= Silo mentality between security, IR and... => use ATT&CK as a common language
= Lack of context related to what we are seeing => enrich alerts with relevant TTP info

How an Assessment Helps
= Too many tools that are not integrated => assessment side-effect: producing tool heatmaps
= Lack of management support => heatmaps are easily digestible and show progress
= Lack of enterprise-wide visibility => assessments provide aggregate coverage charts
= Too many alerts that we can't look into => prioritization can identify high-impact TTPs
= Silo mentality between security, IR and... => assessments help orient teams to the same page
= Lack of context related to what we are seeing => side-effect: mapping analytics/alerts to TTPs

Soundbytes and Takeaways
= Make sure you're setting the right expectations
— ATT&CK — and assessments — are not a silver bullet
= Your coverage isn't just your tools — it's your people and your processes too
= Create heatmaps that convey what you want to convey — and don't use red!
= Don't stop with a heatmap
— Identify key techniques to prioritize in the short term
— Have a set plan — or a set of recommendations to follow up on
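The "enrich alerts with relevant TTP info" idea above can be sketched as a lookup keyed on the technique ID an analytic was mapped to. The alert fields and the tiny technique catalog here are illustrative; in practice the catalog would be loaded from the ATT&CK STIX content rather than hard-coded.

```python
# Minimal technique catalog; a real one would be built from the ATT&CK
# STIX data (github.com/mitre/cti) rather than hard-coded like this.
TECHNIQUES = {
    "T1105": {"name": "Remote File Copy",
              "tactics": ["command-and-control", "lateral-movement"]},
    "T1029": {"name": "Scheduled Transfer", "tactics": ["exfiltration"]},
}

def enrich(alert):
    """Attach ATT&CK context to an alert whose analytic was mapped to a technique."""
    info = TECHNIQUES.get(alert.get("technique_id"), {})
    return {**alert,
            "attack_name": info.get("name", "unmapped"),
            "attack_tactics": info.get("tactics", [])}

alert = {"rule": "smb-remote-file-write", "technique_id": "T1105"}
print(enrich(alert)["attack_name"])  # Remote File Copy
```

Even this trivial enrichment gives analysts the common language the deck argues for: an alert arrives already labeled with the technique and tactic it indicates.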
Long-term: Following Up After an Assessment
[Diagram: the ATT&CK-based SOC assessment is fed and refreshed over time by relevant threat models, sightings data, ATT&CK Evaluations, public resources, and adversary behaviors from ATT&CK]

Links and Contact
= Andy Applebaum
— [email protected]
— @andyplayse4
= ATT&CK
— https://attack.mitre.org
— @MITREattack
— [email protected]
= CALDERA
— https://github.com/mitre/caldera
= ATT&CK-based Product Evals
— https://attackevals.mitre.org/
= ATT&CKcon
— https://www.mitre.org/attackcon
= Data + Code
— https://github.com/mitre/cti (STIX data)
— https://github.com/mitre-attack (code)
= Blog
— https://medium.com/mitre-attack

Backup

MITRE's Public ATT&CK Resources
= Public ATT&CK knowledge base: attack.mitre.org
= Adversary emulation plans
= Structured content
= ATT&CK Navigator

ATT&CK in the Community
= 89 individuals + orgs contributing to ATT&CK!
[Slide lists individual contributors and their organizations; largely illegible in this copy]

Who's Using ATT&CK? (job postings on Indeed as a proxy for usage)
= Financial: SF Fed Reserve, Bank of America, JP Morgan, FS-ISAC, Experian, Freddie Mac, BNY Mellon, US Bank
= Security: RevSec, FireEye, AppGuard, CrowdStrike, CyberSponse, Verodin
= Media: NBCUniversal, Nielsen, Cox, Comcast
= Retail: Target, Best Buy, PepsiCo, Under Armour
= Tech: Microsoft, Intel, Airbnb, Uber
= Others: General Electric, Deloitte, Pfizer, GSK, Marathon, UnitedHealth, Booz Allen, CDW, Leidos
= ...and others!

ATT&CK (and interest in ATT&CK) has grown
[Figure: interest-over-time chart, 2014-2019]
