Beyond Meh-trics: Examining How CTI Programs Demonstrate Value Using Metrics

John Doyle

Cyber threat intelligence (CTI) teams are frequently asked to provide metrics for leadership to validate their contributions

January 9, 2025

This blog was cowritten by John Doyle, Gert-Jan Bruggink, Steven Savoldelli, and Callie Guenther.

Cyber threat intelligence (CTI) teams are frequently asked to provide metrics for leadership to illustrate their contributions in helping improve the organization’s security posture and reduce risk. However, most CTI programs, especially those starting down this path, tend to create weak metrics centered solely on production throughput. These are often viewed as “low-hanging fruit” and are frequently misaligned with the intent behind the request. While measuring throughput may serve as an initial step, CTI teams should ultimately aim to demonstrate meaningful insights that go beyond routine measurements. Metrics that genuinely reflect program impact and maturation require careful planning.

In this article, we examine why organizations struggle to conceptualize and develop effective metrics for CTI programs before presenting a practical guide to metrics categorization that CTI teams can leverage to develop meaningful program measures. Throughout this blog, we use our categorization taxonomy to showcase examples of how CTI metrics align with actionable intelligence, risk reduction, and business impact. We conclude with parting thoughts, highlighting previous research produced specifically on CTI metrics.

Before delving into the blog’s substance, we’d first like to thank Callie Guenther, Braxton Scholz, Chandler McClellan, Rebecca Ford, Katie Nickels, Brandon Tirado, Greg Moore, Jonathan Lilly, and others who helped start us down this path at the workshop we hosted at the SANS CTI Summit 2024 on how to build an effective CTI program.

Demonstrating Value Through Metrics

Metrics are one modality that a cybersecurity function can, and often does, use to measure effectiveness and efficiency. Yet measuring the value cybersecurity provides for an organization is a daunting, challenging, and cumbersome task with unclear benefits to most managers assigned to create them. Perhaps this is because of a lack of exposure, training, desire, aptitude, or perceived value, or any number of other reasons. That is, besides the glaring elephant in the room: it is challenging to show the value of improved decision-making, cost savings from mitigated incidents, and agility in responding to a dynamic threat landscape, especially in a quantifiable way.

To properly create metrics that showcase the value of CTI services, collaborative systems thinking is crucial. This approach should account for factors such as brand reputation, consumer trust, legal consequences, employee productivity, and morale, going beyond the shortcomings of relying solely on the traditional cybersecurity triad of confidentiality, integrity, and availability. The process becomes even more complex when developing meaningful metrics for programs like CTI, whose role is to inform and enhance decision-making among defenders, risk managers, and cybersecurity leadership. Throughout, though, we should strive to work with partner teams (our stakeholders) to understand the impact of our work in driving security outcomes for the organization. A unified story, often conveyed through metrics, can effectively demonstrate how the CTI function strengthens organizational resilience, reduces cyber risk, protects against regulatory fines, and safeguards brand reputation, thereby justifying the cost of staffing and maintaining a CTI program.

Figure 1: A Snippet from Gert-Jan’s Master CTI Metrics Matrix for Illustrative Purposes

Purposeful Metrics: Setting Goals and Outcomes

Before cybersecurity and threat intelligence leadership rush to decree that all programs need metrics, a more effective starting point is to determine what the program wants to measure, why, and what outcomes capturing these measurements will drive. Metrics should serve as a means to an end. Organizations should first establish the purpose of the metrics they intend to capture and clarify how they plan to use this information to drive business decisions. For leadership and management, it may be that we measure and demonstrate value to those who have a say in our funding and existence, highlighting our raison d'être. For our peers, it may be that we measure their expected outcomes and how CTI has helped them achieve those outcomes.

  • In our experience, CTI programs that embrace only weak metrics often lack clearly defined objectives for the CTI program, have a limited understanding of how to connect CTI activities to business outcomes, or genuinely struggle with the inherently difficult task of quantifying the impact intelligence has on improving the organization’s security posture. This reliance on easily gathered data often fails to justify the program's value to leadership, hindering its growth and potentially leading to misallocation of resources.

The process of developing metrics carries an administrative cost that often exceeds merely collecting available data. For example, collecting relevant metric-supporting data may require building new technology, processes, and workflows to gather, store, and display metrics that serve a specific, exploratory purpose. Capturing metrics solely for their own sake is a misstep that can lead to wasted resources. Instead, data collection should support business outcomes and use cases. For example, analyzing the intelligence requirements the team serviced over a fixed period of time can help leadership determine whether out-of-band stakeholder re-engagement is needed to confirm continued demand for intelligence products on a given topic, or whether the team can shift its focus to other pressing needs.
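
As a minimal illustration of that last point, the sketch below counts finished products per intelligence requirement for a quarter and flags requirements that received no coverage. It assumes a hypothetical CSV export with product_id, ir_id, and published columns; the file name, schema, and IR registry are illustrative, not prescriptive.

```python
# Minimal sketch: count finished intelligence products per intelligence
# requirement (IR) over a reporting period to spot IRs with little or no
# coverage. The CSV name, columns, and IR registry below are illustrative.
import csv
from collections import Counter
from datetime import date

PERIOD_START, PERIOD_END = date(2024, 10, 1), date(2024, 12, 31)

coverage = Counter()
with open("cti_products_q4.csv", newline="") as f:
    for row in csv.DictReader(f):
        published = date.fromisoformat(row["published"])
        if PERIOD_START <= published <= PERIOD_END:
            coverage[row["ir_id"]] += 1

# IRs that were never serviced this quarter are candidates for
# out-of-band stakeholder re-engagement.
all_irs = {"IR-01", "IR-02", "IR-03", "IR-04"}  # illustrative registry
for ir in sorted(all_irs):
    print(ir, coverage.get(ir, 0))
```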

It is also worth noting that metrics are not the only method to showcase program success; qualitative accomplishments that highlight value across the organization should also be celebrated.

Building and Evaluating Metrics: A Taxonomy for CTI Metrics

CTI teams, like many stakeholder-driven functions, face challenges in creating universally transferable metrics, which can vary by organization, scale, and stakeholder involvement. Below, we offer a taxonomy for constructing and evaluating meaningful metrics within CTI programs. This taxonomy serves as a foundation to guide teams in building metrics from the ground up, allowing programs to adopt one or more of these framings.

Figure: Building and Evaluating Metrics, A Taxonomy for CTI Metrics

By Function

CTI programs should be thinking about metrics development to drive action, but those not trained in metrics design, or in the mission outcomes it enables through a CTI and cybersecurity lens, often find it difficult to conceive of starting points. We propose that organizations think of these in three broad categories: Administrative, Performative, and Operational. Some overlap may exist between performative and operational metrics, especially in evaluating investments in security controls, tooling, and external datasets. Metrics in each category serve unique purposes, from cost planning to gauging resource utilization.

  • Administrative Metrics: Administrative metrics capture cost expenditures on staff, conference and training budgets, software licensing, and data set procurement. They can assist in determining which data sets were used to support relevant intelligence requirements over a given time period, or enable comparative analyses such as evaluating the impact of adding headcount to the team.
  • Performative Metrics: Performative metrics gauge throughput and measure the level of effort required to complete tasks, supporting capacity baselining. This is useful for resource planning, baselining expectations for job role and level, evaluating aspects of performance, and establishing individual and program-wide goals. The number of tickets created, active intelligence requirements, RFIs supported by type, the rate of proactive versus reactive delivery of threat activity information, and adherence to internally defined quality standards are also performative metrics. While these metrics can provide a high-level overview, they may offer limited insight for driving specific actions or improvements; they have sometimes been called “vanity metrics” and often require further analysis alongside other data to provide a complete picture of workload or performance.
  • Operational Metrics: Operational metrics focus on impact to business operations and the functions designed to support the overall growth strategy. In CTI, we can use this category to illustrate how the services we provided helped drive down risk, informed cybersecurity strategy and planning, and enabled cyber defense actions. These metrics are rarely owned exclusively by the CTI team; they are often collaborative in nature, built with teams that specialize in quantifying cost savings and with other stakeholders.
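
One lightweight way to keep these categories actionable is to tag each metric definition with its function when it is registered, so later reporting can be grouped by category. The sketch below is purely illustrative; the names and registry are our own, not a prescribed schema.

```python
# Illustrative sketch: register metric definitions tagged by function so
# reports can later be grouped by category. All names are hypothetical.
from dataclasses import dataclass
from enum import Enum

class Function(Enum):
    ADMINISTRATIVE = "administrative"
    PERFORMATIVE = "performative"
    OPERATIONAL = "operational"

@dataclass
class MetricDefinition:
    name: str
    function: Function
    description: str
    owner: str  # team responsible for capturing the data

REGISTRY = [
    MetricDefinition("dataset_spend_by_ir", Function.ADMINISTRATIVE,
                     "Licensing cost attributed to serviced IRs", "CTI"),
    MetricDefinition("rfis_closed_by_type", Function.PERFORMATIVE,
                     "RFIs resolved per quarter, broken out by type", "CTI"),
    MetricDefinition("detections_from_cti_reporting", Function.OPERATIONAL,
                     "New detections traced back to CTI products", "CTI + SOC"),
]

by_function = {f.value: [m.name for m in REGISTRY if m.function is f]
               for f in Function}
print(by_function)
```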

Arguably, there could be enough overlap between administrative and performative metrics to merge them into one category. The value of collecting a variety of metric types is that it supports planning as a cost center, given the reality of finite resourcing. As noted, the method of reporting metrics should reflect the unique operational environment of an organization, so we encourage considering the other framings for examining metrics below.

By Audience or Stakeholder

When designing metrics, consider the intended audience and the outcomes these metrics aim to support. A given metric may serve various purposes, from helping CTI managers justify resource needs to highlighting areas of excellence or identifying routine concerns. Most often, the primary audience for metrics is senior leadership, though depending on industry or region, the request may be partially driven by regulatory compliance, headcount scrutiny, or even audit.

To ensure relevance and impact, the metrics chosen for each audience should be directly aligned with key business outcomes. While tailored to different audiences, these metrics ultimately contribute to a unified understanding of how performance impacts business success.

These measures may then be used as fodder for a CTI manager to develop a business justification for additional headcount, highlight an area of excellence, or identify routine areas of concern to consumers. In cases where metrics are tied directly to intelligence requirements, it is prudent to capture the stakeholders, the desired impact, and the realized impact in support of stakeholder outcomes. Common implementations of this metric type are often limited to consumer feedback along the criteria of timeliness, completeness, and actionability, covering both immediate action and input to strategic planning.

Examples

  • The CTI team may keep an internal count or percentage of documented consumer workflows as an early metric for awareness and integration. They could also monitor the number of stored and reviewed Priority Intelligence Requirements (PIRs), reflecting a focus on CTI-owned processes.
  • The CTI team may leverage ticketing platforms to track the volume of opened and resolved tickets, the frequency of direct requests, the teams driving ticket creation, the support provided, and the types of work required. This reflects a middle ground of both CTI-owned processes and the RFI processes of peer teams.
  • CTI may track the rate at which threat actor information is delivered proactively versus reactively to demonstrate increasingly forward-looking analysis. The team may also know where and how to view consumer outcomes and metrics to determine how CTI impacted these operations. This reflects a fully integrated data collection and presentation method.
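
As a minimal sketch of the proactive-versus-reactive rate mentioned above, the snippet below assumes a hypothetical ticket export with a delivery field of "proactive" or "reactive" and a closed date; field names will differ by ticketing platform.

```python
# Sketch: compute the monthly proactive vs. reactive delivery rate from a
# ticket export. The file name and field names are illustrative assumptions.
import csv
from collections import defaultdict

monthly = defaultdict(lambda: {"proactive": 0, "reactive": 0})
with open("cti_tickets.csv", newline="") as f:
    for row in csv.DictReader(f):
        delivery = row["delivery"].strip().lower()
        if delivery not in ("proactive", "reactive"):
            continue  # skip tickets without a delivery classification
        month = row["closed"][:7]  # e.g. "2024-11"
        monthly[month][delivery] += 1

for month in sorted(monthly):
    counts = monthly[month]
    total = counts["proactive"] + counts["reactive"]
    rate = counts["proactive"] / total if total else 0.0
    print(f"{month}: {rate:.0%} proactive of {total} deliveries")
```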

By Organizational Reach

Effective CTI programs leverage regular cadence syncs and consumer group integration to raise stakeholder awareness, grow the brand, expand organizational reach, and ensure cross-functional cooperation. Understanding the workflows of stakeholder teams is vital to demonstrating CTI’s impact. Metrics can measure the frequency and timing of CTI interactions, correlate team utilization via requests for information (RFIs), and measure feedback loops, collaborations, and brand advocacy across the organization.

Early-stage CTI teams may operate with minimal integration, placing a heavy burden on team members to educate consumers about CTI’s role. As integration improves, the goal is to move towards seamless alignment. Teams looking to build metrics based on organizational reach can benefit from lessons learned by industry peers with similar team size, composition, or constraints.

CTI leadership can demonstrate incremental improvement in trust with natural stakeholders, especially where progress was stymied. This can be achieved through breaking down organizational silos, overcoming difficult personalities, and creating opportunities for joint work production, training, or cross-team exposure. Additionally, cybersecurity senior management can explicitly recognize improved intra-team collaboration.

Examples:

  • The risk management team may proactively reach out to CTI leadership or team members to collaborate on short- or long-term assessments, seeking CTI as an input into their overall risk assessment. Subsequent outbriefs to cybersecurity and risk leadership should include representatives from both teams.
  • The red team manager may want to know how often CTI supports their engagements to properly simulate realistic threats to the organization, and how internal processes for both teams can be refined to improve quality.

By Complexity

Metrics vary in complexity based on access to data and the need to engage partner teams. Low-complexity metrics are those that operate within the control of the CTI team, such as tracking the number of phishing emails reported by employees. High-complexity metrics, by contrast, rely on cross-team data, processes, and collaboration, which requires accounting for additional administrative overhead.

High-complexity tasks, and the metrics captured around them, may involve significant collaboration, implicit assumptions, and inadvertent bias, leading to potential cascading errors. This underscores the need for the CTI program lead to remain vigilant during the metric creation and capture process. Balancing effort across different metric types is essential to ensure that the CTI team is not overburdened by overly complex metrics while still capturing valuable insights that require cross-team collaboration. This balance allows for efficient resource allocation and maximizes the impact of the CTI program.

Examples:

  • The CTI team’s count of data sources used during the production of intelligence over a given period of time may be wholly dependent on the CTI team’s operations and could be considered a low-complexity metric.
  • The CTI team performing regular spot checks on whether produced intelligence adhered to internal quality standards, provided the proper substantive depth, and was written through the lens of business operations support would be a low- to moderately complex task due to the resource commitment it requires.
  • A metric built to identify cost savings through risks reduced, faster adversary discovery, strong detections written, and other use cases broken out per product produced across stakeholders would be considered high-complexity. This requires a clear understanding of consumer workflows, collaboration, and agreement over outcomes.

By Point-in-Time or Period of Time

Metrics may provide value either as point-in-time snapshots or as longitudinal data over extended periods. While extended time frames help identify trends and outliers, it becomes challenging to attribute outcomes to specific causes. Therefore, it’s critical to document factors that influence metric interpretation.

Note: structured data stored logically and tagged properly can be queried easily in spreadsheets or in more powerful central intelligence systems such as The Vertex Project’s Synapse, making it quick to build recurring reports that surface longitudinal trends. The following two illustrations show ways to store intrusion-related data in a structure that is easy to query.

Figure 2: Point in Time Intrusion Artifacts Per Kill Chain Stage Aligned to the Diamond Model
Figure 3: Longitudinal Representation of Intrusions to Derive Trends
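
For teams working without a dedicated platform, a rough equivalent of the tagged, queryable structure shown in the figures can be sketched in pandas; the CSV name and columns (intrusion_id, kill_chain_stage, artifact_id, observed) are illustrative assumptions rather than a required schema.

```python
# Sketch of the "structured and tagged" idea in plain pandas rather than a
# dedicated intelligence platform: each row is one intrusion artifact tagged
# with the intrusion ID, kill chain stage, and observation date.
import pandas as pd

artifacts = pd.read_csv("intrusion_artifacts.csv", parse_dates=["observed"])

# Point-in-time view: artifacts per kill chain stage for a single intrusion.
snapshot = (artifacts[artifacts["intrusion_id"] == "INC-2024-0042"]
            .groupby("kill_chain_stage")["artifact_id"].count())
print(snapshot)

# Longitudinal view: intrusions per quarter to surface trends over time.
trend = (artifacts.drop_duplicates("intrusion_id")
         .assign(quarter=lambda d: d["observed"].dt.to_period("Q"))
         .groupby("quarter")["intrusion_id"].count())
print(trend)
```

The same grouping logic works in a spreadsheet pivot table; the point is the consistent tagging, not the tooling.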

Examples:

  • If the team tracks volume of output and half of the production team was out for a given month, that caveat should accompany the data for as long as the month remains relevant to the presentation, not live only as a footnote in a middle manager’s head. When displaying this type of information as a percentage or ratio, apply statistical best practices so the data can be interpreted properly (see the sketch after this list).
  • Production or impact over a given month or year, and its year-over-year change, can prompt exploration into the reasoning that could explain a shift and whether it reflects a new baseline, such as a strong-performing team member leaving the CTI program and creating a lapse in coverage until resourcing is addressed.
  • Shifts in how CTI spends its time supporting consumers can illustrate investment in innovation and justify an expanded remit, engineering development efforts, or tool procurement.
  • Growth or decline in reliance on individual sources can pinpoint analytic dependencies, prompt a revisit of the collection management plan, and drive evaluation of available data sources and their respective value propositions.
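
To make the normalization point from the first bullet concrete, here is a toy sketch (with invented numbers) that divides monthly output by available analyst-days, so a month with reduced staffing is not misread as a productivity drop.

```python
# Toy sketch: normalize monthly output by analyst availability so a month
# where half the team was out is comparable. All figures are invented.
monthly_products = {"2024-10": 22, "2024-11": 12, "2024-12": 20}
analyst_days_available = {"2024-10": 110, "2024-11": 58, "2024-12": 104}

for month in sorted(monthly_products):
    per_day = monthly_products[month] / analyst_days_available[month]
    print(f"{month}: {monthly_products[month]} products, "
          f"{per_day:.2f} per available analyst-day")
```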

Getting Started with Metrics: An Incremental Approach

CTI programs that are starting to build metrics should be intentional about their creation, outlining clearly the purpose behind each. Ultimately, metrics are a means to an end. Period. Hard stop. As programs mature, they can develop metrics that are more strategic and complex, designed to unearth trends. However, experimentation is usually required in order to right-size these aspirational metrics.

There is a deeper discussion needed, however, about what to capture and when, as the CTI program’s demand, capacity, and capability grow. Only once the specific threats, vulnerabilities, PIRs, or stakeholder demands are clarified and documented do you have a tangible CTI program to build on. Any metrics developed before this baseline level of maturity will fall victim to high noise, shifting contexts, and exceedingly fluid business processes.

As Rob Lee and Rebekah Brown emphasize in the SANS FOR578 course, the core metric for a CTI team is whether it meets stakeholder needs and demonstrates business impact. CTI teams should aim to provide straightforward answers to common program questions and establish a tangible program baseline, capturing specific threats, vulnerabilities, and stakeholder needs. As a customer service function, this is imperative to justify the CTI program’s existence.

The initial metrics a program creates should focus on data that is easily obtainable, minimally complex, and easy to interpret. As CTI teams develop familiarity with data collection, they can evolve through the taxonomy, capturing more nuanced and sophisticated metrics. Continuous improvement, supported by structured, repeatable data, is crucial for metrics-driven maturity.

Example:

  • Start with year-over-year trend analysis to establish baselines. This gives stakeholders a clear view of how security posture evolves over time and helps inform strategic decisions.
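
A toy illustration of that starting point, using invented figures, is simply a year-over-year percentage change computed over a couple of baseline measures.

```python
# Toy sketch: year-over-year change for a handful of baseline measures.
# The metric names and figures are invented for illustration only.
baseline = {
    "intrusions_investigated": {2023: 41, 2024: 35},
    "detections_from_cti":     {2023: 18, 2024: 29},
}

for name, years in baseline.items():
    prev, curr = years[2023], years[2024]
    change = (curr - prev) / prev
    print(f"{name}: {prev} -> {curr} ({change:+.0%} YoY)")
```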

Every metric created should be evaluated for what it might imply to an uninformed consumer, so that CTI leadership can give staff adequate onboarding to address the pitfalls and errors analysts may encounter during routine delivery. The easiest way to do this is to consider implications and inferences around causality, assumptions, and gaps. This becomes critical when conveying metrics over time, as people may erroneously fill in knowledge gaps in ways that are unsupported by evidence.

Examples:

  • The volume of CTI products may dip over a given period, but if the depth of reporting has increased, it would be an error to assume the CTI team is less productive. Such an assumption represents an unsupported conclusion about causality.
  • Metrics built around risk reduction will almost certainly require assumptions about value. For example, successfully mitigating damage to brand reputation is unlikely to yield a quantifiable figure. CTI practitioners must clearly communicate assumptions of this value proposition.

Building Stakeholder Engagement with Metrics

Metrics not only convey performance but also serve as tools to secure buy-in from critical stakeholders. Effective metrics enable CTI programs to demonstrate responsible investment in security resources, communicate the rationale behind metric selection, and reveal insights that support data-driven decision-making. By engaging stakeholders in this process, CTI teams increase the likelihood of program support and build trust in the data used to inform cybersecurity strategies.

Growing a CTI program involves more than tracking metrics; it requires actionable insights that drive program alignment with organizational goals. This alignment covers practical elements such as processes, deliverables, and integrations that allow for consistent measurement.

Example:

  • If the organization uses JIRA, CTI teams can create deliverable metrics using JIRA’s built-in solutions, establishing a cost-effective program dashboard that tracks CTI engagement.
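
If a team prefers to pull the same numbers programmatically rather than rely only on built-in dashboards, a hedged sketch against JIRA's REST search endpoint might look like the following; the base URL, credentials, project key CTI, and the rfi label are all assumptions to adapt to your own instance.

```python
# Hedged sketch: count CTI tickets via JIRA's REST search endpoint.
# The instance URL, credentials, project key "CTI", and "rfi" label are
# placeholders -- adjust the JQL and auth to your environment.
import requests

JIRA = "https://your-instance.atlassian.net"
AUTH = ("cti-bot@example.com", "<api-token>")  # placeholder credentials

def count_issues(jql: str) -> int:
    resp = requests.get(f"{JIRA}/rest/api/2/search",
                        params={"jql": jql, "maxResults": 0},
                        auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["total"]

opened = count_issues("project = CTI AND created >= -90d")
rfis = count_issues("project = CTI AND labels = rfi AND created >= -90d")
print(f"Tickets opened in the last 90 days: {opened} (of which RFIs: {rfis})")
```

Counts like these can feed the same cost-effective dashboard, refreshed on a schedule instead of assembled by hand.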

To close this section, the table below gathers examples that have proven effective in supporting teams while also contributing to the organization’s internal maturity journey. It serves as a practical reference, illustrating various CTI metrics by role, audience, complexity, and time frame.

Table 1: Sample CTI Metrics Table

| Metric Type | Example | Role | Audience | Complexity | Time Frame |
| --- | --- | --- | --- | --- | --- |
| Report Utility | Share of reports using licensed data sources | Administrative | Senior Management | Low | Point-in-Time |
| Resource Allocation | CTI support to red team engagements | Performative | Red Team | Medium | Period of Time |
| Threat Reduction Impact | Measured decrease in identified risks | Operational | Risk Management | High | Period of Time |
| Consumer Feedback Rate | Frequency of RFI submissions | Integration | Consumer Teams | Low | Period of Time |
| Cost-Benefit Analysis | Estimated savings from CTI-led mitigations | Operational | Finance | High | Point-in-Time |

Closing Thoughts

CTI teams will continue to be asked to provide metrics to demonstrate their value to leadership, regardless of whether the request aligns well with what metrics are meant to accomplish. Any CTI professional should approach such requests with as little ambiguity as possible, seeking clarification on the desired outcomes and on leadership’s appetite to allocate additional resources should the intended goals exceed existing capabilities or capacity.

We would be remiss if we did not highlight existing CTI metrics resources:

  • Gert-Jan’s Master CTI Metrics Matrix
  • Marika Chauvin and Toni Gidwani's 2019 SANS CTI Summit talk, How to Get Promoted: Developing Metrics to Show How Threat Intel Works
  • Metrics are the Drivers of CTI Value
  • Freddy Murre's 2022 FIRST CTI Symposium presentation, Vanity Metrics - The BS of Cybersecurity

We stewed on this quite a bit, and we hope the insights provided in this blog offer a solid starting point for approaching metrics generation categorically. Feel free to follow us on social media for more content on this and other CTI topics:

  • John Doyle
  • Gert-Jan Bruggink
  • Steven Savoldelli
  • Callie Guenther

Special thanks to Freddy Murre and Nicole Hoffman for their thoughtful peer review, questions posed, and substantive suggestions that improved the blog’s quality. And many thanks also to Koen Van Impe who created a most excellent graphic to quickly help summarize the main categories presented within this article.


Tags:
  • Digital Forensics, Incident Response & Threat Hunting
