
Update on Cyberspace Situational Awareness Research – 1Q2017

Graph Processing Status Message

Here is a quick 1Q2017 update on my 2017 cyberspace situational awareness (CSA) research projects (see EOY 2016 status update here):

(1) Completed initial development of the cyberspace visualization application, faster than I expected. The visualization was developed in C# and exceeded my expectations.

(2) Generalized my visualization engine so it works easily with a variety of sensors. The high-level architecture evolved into this:

a. Store sensor data in a relational database where each row in the database represents a unique originating IP address.

b. Score the sensor data above based on a custom risk-scoring algorithm I developed. Prune the database (when the data grows very large) based on risk and time since the object was last seen, deleting low-risk and no-risk objects from the database.

c. Write the entire database out as a JSON file, transport the serialized objects across the Internet, and visualize them.
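The engine itself was written in C#, but the store-score-prune-serialize pipeline in steps a–c above can be sketched in Python. Everything here is illustrative, not my production code: the table schema, the placeholder `risk_score` (my actual risk-scoring algorithm is custom and not shown), and the function names are all assumptions for the sake of the example.

```python
import json
import sqlite3
import time

# Placeholder risk scorer: the real algorithm is custom; this one just
# normalizes a hit count into [0.0, 1.0] so the sketch is runnable.
def risk_score(hits: int) -> float:
    return min(hits / 100.0, 1.0)

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE sensor_objects (
           src_ip    TEXT PRIMARY KEY,  -- step a: one row per originating IP
           hits      INTEGER NOT NULL,
           last_seen REAL NOT NULL,     -- Unix timestamp
           risk      REAL NOT NULL
       )"""
)

def observe(ip: str, new_hits: int) -> None:
    """INSERT or UPDATE a sensor observation, risk-scoring as we go (step b)."""
    row = conn.execute(
        "SELECT hits FROM sensor_objects WHERE src_ip = ?", (ip,)
    ).fetchone()
    total = (row[0] if row else 0) + new_hits
    conn.execute(
        "INSERT OR REPLACE INTO sensor_objects VALUES (?, ?, ?, ?)",
        (ip, total, time.time(), risk_score(total)),
    )

def prune(min_risk: float, max_age_s: float) -> None:
    # Step b pruning: delete low-/no-risk objects not seen recently.
    conn.execute(
        "DELETE FROM sensor_objects WHERE risk < ? AND last_seen < ?",
        (min_risk, time.time() - max_age_s),
    )

def export_json() -> str:
    # Step c: serialize the whole object-base for transport to the visualizer.
    rows = conn.execute(
        "SELECT src_ip, hits, last_seen, risk FROM sensor_objects"
    ).fetchall()
    return json.dumps(
        [dict(zip(("src_ip", "hits", "last_seen", "risk"), r)) for r in rows]
    )
```

The key design point is that the sensor object-base stays an ordinary relational table keyed by originating IP, so the INSERT/UPDATE path and the prune pass are both single, indexed SQL statements.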

(3) Validated the multi-sensor data fusion (MSDF) approach for cyberspace security and cyberspace situational awareness:

a. Local object-bases from MSDF map directly to relational databases. Graph databases, currently trendy, are not required for MSDF approaches to cyberspace SA.

b. Data-cleansing is best performed during the INSERT or UPDATE DB operation. Using a well-typed DB structure ensures a much more efficient data-processing operation downstream.

c. The visualization engine should be as generic as practical with a local configuration file which is loaded at runtime. This permits easy changes to the cyberspace visualization layout including various mappings, color schemes, node sizes, etc.
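To make point (c) concrete, here is a minimal Python sketch of a runtime-loaded visualization configuration. The file format and field names (`layout`, `color_by`, `node_size`, etc.) are assumptions for illustration, not my engine's actual schema.

```python
import json

# Illustrative config file contents: layout algorithm, color scheme, and
# node-size mapping are exactly the kinds of settings that should live
# outside the engine so they can change without a recompile.
CONFIG_TEXT = """
{
  "layout": "force-directed",
  "color_by": "risk",
  "color_scheme": {"low": "#2ecc71", "medium": "#f1c40f", "high": "#e74c3c"},
  "node_size": {"attribute": "hits", "min": 2.0, "max": 20.0}
}
"""

def load_config(text: str) -> dict:
    """Parse the config at runtime, validating only what the engine needs."""
    cfg = json.loads(text)
    for key in ("layout", "color_by", "node_size"):
        if key not in cfg:
            raise ValueError(f"missing required key: {key}")
    # Unknown keys pass through untouched, so new mappings can be added
    # to the file without changing engine code.
    return cfg
```

Validating only the required keys and passing the rest through is what keeps the engine generic: a new mapping is a config edit, not a code change.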

A Small View of Cyberspace by Tim Bass

(4) With the visualization engine basically “done”, I made the design decision to focus on a risk-driven blackboard architecture:

a. Local sensor data is risk-scored and only data which has exceeded a predefined risk threshold is placed on the blackboard (into the blackboard database).

b. Multi-sensor data fusion occurs on the blackboard database. Object refinement is performed on the local sensor databases.

c. The blackboard controller performs the bare-bones functions needed to INSERT and UPDATE records on the blackboard.

d. Various data-fusion algorithms can be processed on the blackboard data for situation refinement.
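The risk-driven flow in steps a–c above can be sketched as follows. This is a toy Python illustration of the concept, not my prototype: the class, method names, and the `RISK_THRESHOLD` value are all invented for the example.

```python
RISK_THRESHOLD = 0.7  # assumed predefined threshold (step a)

class Blackboard:
    """Bare-bones blackboard: admit only high-risk objects, then
    INSERT new records or UPDATE existing ones (step c)."""

    def __init__(self) -> None:
        self.objects: dict[str, dict] = {}  # keyed by originating IP

    def post(self, obj: dict) -> bool:
        # Step a: the risk gate. Objects below threshold stay in the
        # local sensor object-base and never reach the blackboard.
        if obj["risk"] < RISK_THRESHOLD:
            return False
        existing = self.objects.get(obj["src_ip"])
        if existing is None:
            self.objects[obj["src_ip"]] = dict(obj)   # INSERT
        else:
            existing.update(obj)                      # UPDATE (fusion, step b)
            existing["sightings"] = existing.get("sightings", 1) + 1
        return True
```

Keeping the controller this thin is deliberate: situation-refinement algorithms (step d) run over the blackboard data separately, while object refinement stays out on the local sensor databases.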

(5) In a nutshell, I have proven, at least to my own satisfaction, that the MSDF approach to cybersecurity is viable for large sets of distributed sensor data. I have created stunning 3D visualizations using various graph layout algorithms. The visualizations are useful, but not the be-all and end-all, because visualization is only one key part of the MSDF / Blackboard architectural design concept.

(6) I have started to experience frustration when I read the myriad self-promoting marketing posts on social media such as LinkedIn; so I’ve decided to “give up” on any hope that I can find anything useful in social media regarding this topic. Social media, it seems, is much closer to being “the problem” than “the cure”.

(7) Since I am ahead of schedule, I plan, after a short break, to create a working prototype of a “new” risk-driven cybersecurity blackboard architecture with numerous distributed sensors and sensor object-bases as blackboard knowledge sources.