
Category Archives: Information Technology

Invited to Present at the UIS Cyber Defense and Disaster Recovery Conference


It’s been a while since I’ve posted anything of substance … buried in transitioning a new 24-person software development contract, academic papers for the PhD program, and bid & proposal efforts, it’s likely to be a while yet.  Even so, I thought it worth posting that I’ve been invited to present at the 2014 Cyber Defense and Disaster Recovery Conference (CDDR) at the University of Illinois Springfield (UIS) on Thursday, March 13.  There should be a mid-sized (~225+) paid audience of practitioners, business people & academics.  The event is sponsored by InfraGard Springfield, with coordination and support from the FBI and UIS.

For more information on the conference, take a look at their archives – they’ve had a great list of presentations and speakers, and I’d certainly be privileged to be among them.

The theme I’ve been asked to present on is security incident response as firms migrate to distributed storage technologies, and I just about proposed a presentation title of The bits be everywhere – keeping them in their tubes & cleaning up the mess when they spring a leak.  I did come to my senses; the actual topic is TBA after I’ve confirmed my availability, but the general subject remains the same & is something I’ve written around the edges of on this blog in the last year.  The working title (at least for the moment) is a bit more professional, and is something on the order of Minimally intrusive governance & distributed storage systems: Considerations for disaster recovery and contingency planning in a mobile world.

Knowing that there will be small business leaders in attendance, and having been asked to make the presentation instructional, I’m tempted to fall back to the broader areas of governance, compliance & risk.  Since the ways that varied attendees might prepare for security and incident response ultimately come down to “it depends”, I think a broader perspective on the criticality of good governance and an orchestrated BCP/DRP process specific to distributed data storage should be appropriate.  If there’s one thing I know, it’s that awareness of security is insufficient, as is the presentation of a solid business case.  The competing priorities of security and workflow efficiency must be addressed or people will work around the controls.  Though this is a recent area of study, Albrechtsen (2007) and Takemura (2011) both provide very good evidence of this, identifying a need to blend not only awareness, but practical actions that can be embedded into process without significant impact to the overall efficiency of operations.

Viewed from the broader perspective, these are not easy challenges to solve; and unfortunately the problem is less frequently technological in nature, and more often tied to the behavior of the organization itself.  Addressing change in technology, workflow, and culture (regardless of reason) requires a deeply rooted desire to change behavior patterns on the part of those who must implement them.  In a fashion similar to the myriad theories and models of change management and organizational behavior, it’s the individuals within the group that have to be effectively targeted.  Throw in a twist of technological adoption, plus the typical fear, uncertainty, and doubt (FUD) normally used in areas of security that are often seen as “keeping people from doing their jobs” … and you’ve got the perfect storm of things that are tough to change.  Focusing, then, on the intersecting issues of storage management, IA, and the optimization of information security investments within a framework of process re-engineering & adoption strategies borrowed from the TAM (Venkatesh, Morris, Davis, & Davis, 2003), I’ll be pulling from other research and courses I’ve developed to combine into a one-hour session.

At the very least, it did force me to take a look at my short bio … which I haven’t done in far too long.  While not fantastic, I think it still gets the point across & I’ve attached it here for my own entertainment.  I’ll post more detail as this gets fleshed out over the coming weeks, and as always I welcome any comments or input you have.

Short Bio – Levii Smith

//Levii

References

Albrechtsen, E. (2007). A qualitative study of users’ view on information security. Computers & Security, 26(4), 276–289. doi:10.1016/j.cose.2006.11.004

Takemura, T. (2011). Statistical Analysis on Relation between Workers’ Information Security Awareness and the Behaviors in Japan. Journal of Management Policy and Practice, 12(3), 27–37.

Venkatesh, V., Morris, M., Davis, G., & Davis, F. (2003). User Acceptance of Information Technology: Toward a Unified View. MIS Quarterly, 27(3), 425–478.

 

 

Application for Microsoft Academic Search App ID

Contact Person: Levii Smith
Contact Email: smithlev@levii.com 

The abundance of academic publications, coupled with the highly specific nature of each publication, makes the process of research extremely tedious. Existing academic search engines generally rank results by relevance to query keywords. Furthermore, the results are returned as a long list of links, which is easy to skim but difficult to actually use. These characteristics (inherited from web search) are not ideal for the domain of academic publications – when doing research, we want to know which papers are important as well as which are relevant, and to navigate easily between papers while remaining aware of the context of the search.

One useful measure of importance is the number of citations (used by, e.g., MS Academic Search and Google Scholar). We propose using this reference information to visualize the local neighborhood of a paper within the larger graph of academic publications. This is similar to the MAS citation graph, but would present both forward and backward references, as well as making the vital properties of importance and relevance more obvious to the user. This would facilitate rapid discovery of important information in fields unfamiliar to the user.
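
To make the idea concrete, below is a minimal sketch (Python with networkx; the paper IDs are hypothetical) of the proposed local-neighborhood view: one hop of forward and backward references around a focal paper, with neighbors ranked by citation count as the importance measure.

```python
# A sketch of the local citation neighborhood; not MAS's actual API.
import networkx as nx

# Directed edge u -> v means "u cites v".
G = nx.DiGraph()
G.add_edges_from([
    ("focal", "classic-1"), ("focal", "classic-2"),  # backward references
    ("newer-1", "focal"), ("newer-2", "focal"),      # forward references
    ("newer-1", "classic-1"),
])

def neighborhood(g, paper):
    """Papers the focal paper cites, and papers that cite it."""
    backward = list(g.successors(paper))   # what it references
    forward = list(g.predecessors(paper))  # what references it
    return backward, forward

def importance(g, paper):
    """Citation count as a rough importance score."""
    return g.in_degree(paper)

refs, cites = neighborhood(G, "focal")
ranked = sorted(refs + cites, key=lambda p: importance(G, p), reverse=True)
print(ranked)  # most-cited neighbors first
```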

I have read and agreed with the Microsoft/MAS Terms of Use.
Microsoft Terms of Use
Microsoft Academic Search Terms of Use

Project URL: http://levii.com/academic-search-and-reference-visualization

 


Safely Moving Sensitive Data Outside of the Organization: Using Outlook to Encrypt to External Entities

Sensitive information, whether:

  • FOUO documents being handled for the government
  • PII during hiring and staffing
  • Proprietary or confidential company information
  • Competition-sensitive intelligence
  • Or any other protection-required data (PRD)

requires special use and handling of digital communications.  Unfortunately, a lack of education and familiarity with available methods leads to either (1) data being exchanged using insecure means, or (2) a breakdown in communication.

Sharing this information without protection has significant implications under both criminal and civil law, as I’ve discussed before in relation to negligent entrustment.  Even so, it is quite often an area of Information Assurance (IA) delegated to the lowest levels to implement and follow from a policy perspective.  Given that a breach could be a public-relations nightmare, and that statutory liability could be financially destructive to a firm, it’s essential that data is protected at rest and while in transit.

Truthfully, among the methods available for dealing with risk (i.e. insurance, transference, mitigation, avoidance), the simplest in this regard is avoidance: “Just don’t send PRD”.  However, a knee-jerk refusal to share sensitive data can be problematic to the operation of many organizational programs, and an intentional breakdown in the ability to operate transparently is not only inefficient, but can signal larger communication issues.

Too frequently, I’ve encountered the assumption that encrypted mail cannot be exchanged between unrelated organizations, though it really is a simple process that only seems extraordinary.  I think this is due in part to the lack of education we provide as part of corporate/government/etc. cyber-awareness programs, but it is certainly also due to the fact that the first step in the process (the public key exchange) is typically automated and transparent.  Since the advent of Kerberos, however, virtually every directory service with modern authentication ships with some form of certificate authority (CA), and having neither PKI certificates available within an organization nor certificates signed by an external CA (eCA) borders on negligence.

This paper provides a short guide to address this gap, and assumes one of the most common scenarios within the enterprise: the use of a public key infrastructure and Microsoft Outlook.  Though certainly relevant to contractors and the DoD, it remains applicable anywhere that meets these two criteria.
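
For those curious about what Outlook does under the hood once certificates are exchanged, here is a minimal sketch of the core S/MIME operation using Python’s M2Crypto rather than Outlook itself.  It is an illustration under assumptions, not the guide’s procedure: recipient.pem stands in for the public certificate harvested from the other party’s digitally signed message, and the message body is a placeholder.

```python
# A sketch of S/MIME enveloping with M2Crypto; filenames and message
# text are hypothetical placeholders.
from M2Crypto import BIO, SMIME, X509

smime = SMIME.SMIME()

# Load the recipient's public certificate (obtained, e.g., from a
# signed message they sent us) into the encryption stack.
stack = X509.X509_Stack()
stack.push(X509.load_cert('recipient.pem'))
smime.set_x509_stack(stack)
smime.set_cipher(SMIME.Cipher('aes_256_cbc'))

# Encrypt the message body so only the certificate holder can read it.
body = BIO.MemoryBuffer(b'FOUO: staffing roster attached.')
p7 = smime.encrypt(body)

# Write out a mail-ready PKCS#7 envelope.
out = BIO.MemoryBuffer()
smime.write(out, p7)
print(out.read().decode())
```

Outlook performs the same exchange transparently: receiving a digitally signed message stores the sender’s certificate, after which the Encrypt option becomes usable for that contact.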


Extending Requirements Management for Business Planning

I’ve used SysML to model requirements and traceability for impact analysis, coverage, etc. for a fair amount of time, and across more domains than the typical use of CASE tools.  I’ve found requirements models useful for capturing governance and compliance requirements from regulatory, contract, and corporate policy, and have even used them to model my own priorities for easy loading into traditional project management software (e.g. MS Project).  To date my applications of choice have always been based on the Eclipse Requirements Modeling Framework (RMF) as incorporated in the TopCased Model Based Systems Engineering (MBSE) tool-chain.  That project is now being migrated to PolarSys, though the requirements part appears to have been incorporated as a standalone component by ProR.
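
As a concrete illustration of the impact-analysis use case, here is a minimal sketch (Python with networkx; the element IDs are hypothetical) of how trace links in a requirements model flag everything downstream of a changed policy or contract clause.

```python
# A sketch of trace-link impact analysis; not any particular RMF/ProR API.
import networkx as nx

trace = nx.DiGraph()  # edge u -> v means "v derives from / satisfies u"
trace.add_edges_from([
    ("POLICY-7", "REQ-12"),      # a corporate policy drives a requirement
    ("CONTRACT-3.2", "REQ-12"),  # so does a contract clause
    ("REQ-12", "TEST-44"),       # the requirement is verified by a test
    ("REQ-12", "PROC-9"),        # and implemented by a process
])

def impact(graph, changed):
    """Every element downstream of a changed source element."""
    return nx.descendants(graph, changed)

print(impact(trace, "CONTRACT-3.2"))  # {'REQ-12', 'TEST-44', 'PROC-9'}
```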

In traditional Microsoft fashion, a good set of tools based on the Collaborative Systems Requirements Modelling Language (CSRML), a goal-oriented markup for requirements modelling, has recently come to my attention.

If there’s anything you need a formal requirements model for, but without the overhead of a full MBSE approach – it’s certainly worth looking into.  Questions regarding the point of modeling non-engineering activities are relatively predictable, and I’d like to address a couple of the more common ones, asked by those represented in the Venn diagram below.

Actors in Enterprise Compliance

 

“What’s wrong with traditional BPM, CASE, RM, or other IDE type of tools?”

Unfortunately, Papyrus, TopCased, and many other similar applications are (a) less than user-friendly for those without a background in UML, and (b) limited in scope of representation and access for collaboration.  From that perspective I wanted to think through alternative approaches, and the explosion in collaborative IDEs for software development leaves me hoping for a similar capability for modeling.

There is certainly no shortage of collaborative applications, and I’ll follow up with some other shortcomings in the current SaaS market in a separate post.  A prime example of a promising entry, however, is the class of IDEs that provide tangible value to teams, such as Cloud9 (use Chrome or FF).  This SaaS-modeled IDE provides real-time visibility between members working on a project, and is an incredibly efficient tool I use when working with remote developers during interviews & skills assessments.  While I firmly believe that nothing quite replaces a face-to-face interview, being able to work through a real-world project gives me insight into a candidate’s thought processes and fit with the team dynamic they’re interviewing for … saving on travel costs, or the expense of a poor hiring decision.  Alternately, something as simple as providing training, guidance, and leadership support to a remote employee who needs assistance with a concept is facilitated through such tools.  Of course we have many options for this, from simple screen-sharing and chat applications to full office suites such as Office365, Google Docs, or Office 2010 (when backed by SharePoint), all of which support real-time, multi-user, collaborative editing.  Extending the IDE as an asset into this domain solves two intermediate problems in having our requirements & models generate value for the organization.

  1. Enterprise architecture (EA) is often seen by personnel as an esoteric representation of the organization and its processes.  Without open access, visibility, ease of manipulation & methods of sharing EA data … it often ends up gathering dust in the corner instead of being used to drive change.  By enabling a collaborative and data-driven view of the enterprise (whether business process or IT architectures), the uses of the data are opened up to creative possibilities & transparency can assist in the maintenance of the information – preserving the original investment in investigation.
  2. Addressing requirements and developing plans for compliance within the enterprise is noble, but only touches on the underlying need for traceability.  With a more collaborative and diverse audience, the ability to trace requirements, their precedence & their sources, while mapping them against each other, would bring a degree of transparency into the areas of Governance, Compliance & Risk (the subject of my next article).

“Why do I need to model my compliance needs? What’s wrong with using ABC?”

Some might accuse me of overcomplicating what is, on the surface, a relatively straightforward line of questioning: “What are our compliance requirements, what policies & processes do we have to cover them, and how do we ensure our due diligence in validating adherence to those policies?”  These might include topics in subcontractor management, ANSI EVM controls, IA control requirements for PCI, or any of a myriad of complementary frameworks.  In the typical firm, however, the perspective necessary to answer this line of questioning is rarely found in one person or group, and without collaborative ideation … it’s incredibly likely either that (a) something will be missed, or (b) the information won’t be effectively communicated to personnel.  In either instance, a failure of due diligence may leave the firm at risk.

Often I’ll see this documented as spreadsheets with a matrix of tabs, but this fails the traceability test & doesn’t lend itself to automation in case of audit.  Furthermore, it fails to offer extensibility at scale, doesn’t afford the organization an efficient method of communicating the requirements to program/project managers or their staff, dramatically increases the overhead required to train and validate program execution, and increases the amount of rework or rediscovery needed by personnel who need that data.
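
As a sketch of the alternative, consider compliance requirements held as structured data rather than spreadsheet tabs.  Even this toy model (Python; the control IDs and sources are hypothetical) supports the automated coverage check an auditor would ask for.

```python
# A sketch of requirements-as-data with an automated gap check; the
# IDs, sources, and policy names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    rid: str
    source: str          # regulation, contract clause, or corporate policy
    text: str
    implemented_by: list = field(default_factory=list)  # policy/process IDs

reqs = [
    Requirement("IA-001", "PCI DSS 8.2", "Enforce password complexity",
                implemented_by=["POL-ACCESS-03"]),
    Requirement("EV-014", "ANSI-748", "Report monthly EV metrics"),
]

# Audit view: any requirement without an implementing policy is a gap.
for r in (r for r in reqs if not r.implemented_by):
    print(f"UNCOVERED: {r.rid} ({r.source}) - {r.text}")
```

The same structure extends naturally to the trace links discussed above, and to export for ERP or audit systems.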

Blame it on my firm belief in the value of quantitative measurement and management as not only a “best practice”, but a “required practice” for high-performing teams & organizations; or perhaps on my firm belief in transparent leadership and governance.  Enough people have heard my argument against the use of spreadsheets as a form of enterprise knowledge management to understand why I think the way the typical firm handles this today needs something different.  That something needs to be easy to use for the small to mid-sized firm, needs to tie into ERP and audit systems, and needs to be a collaborative method treating EA & governance as reusable, transparent, and executable enterprise assets.

Though I say it jokingly to my colleagues, there is truth when I note that I’ll need to consider an MBA after I complete my PhD, and in all likelihood should follow that with a JD … there’s a niche opportunity for systems, and people, that have a deep understanding across the areas of business, law & technology.

“So what can we do about it?”

While I’m often accused of having big ideas, being capable of developing solutions to most of them is a core strength I rely on to demonstrate value through prototypes.  To that end, and based on my research/papers on the extent to which regulatory, contract, and corporate governance cross organizational disciplines, I’ve begun modeling a SaaS solution for publication control mapping and compliance methods.  I’m through the generic model of compliance, and have a number of concepts on how to carry it forward, but am leaning towards a community-driven site and application that could tie to my parallel efforts in using predictive analytics for effort & cost estimation.  As each gets a bit more polished I’ll publish them under an open license; in the meantime I certainly appreciate any insight, critique, or commentary on the subject.

So jump on in and share your thoughts.  Do you see this as a problem, or is it a niche need? How do you manage the myriad set of policies, procedures, and alignment to the governance frameworks in your industry? Any particular tools that you use, and how do you ensure that those processes are executable by staff? Do you have an orchestrated set of executable workflows, or is it tribal knowledge and lots of training (for hopefully compliant programs)?

 

Posted on January 5, 2014 in Business, Information Technology

 


A Holiday Break

With the holiday season upon us, I’ll be taking a break from writing original posts and/or papers for a couple of weeks.  I won’t be taking this time to rest on my laurels, however.  I have plenty of projects to complete around the house, a decade of photos to sort and upload, and a number of older works that need to be dusted off and posted so that others might get some value from them.

Happy Holidays

I’ve continued my love of open collaboration, and strive to share my knowledge and expertise freely with those who ask.  As every new cohort began, I opened with the same statement, which I still hold to be a positive truth:

Though I may be billed as your teacher, trainer, or as the course instructor, none of these are necessarily true.  As a room of professionals, each of you has chosen to be here.  I cannot teach those that don’t want to learn, nor can I instruct those that don’t wish to listen.  I can guide, assist, advise, listen, and mentor.  I will freely answer any question with which I’m familiar and certain, and research answers to those I am not.  I’m a resource, a coach, and your collaborator; and I look forward to the next 40 days we’ll spend here together.

It’s in this spirit that I’ve selected the training decks from the CNAP program to post over the next couple of weeks.  The first series to follow this post will be the Cisco Networking Academy Program (CNAP) training material, slides, labs, and other handouts that I prepared as an instructor between 2000 and 2005.  They’ve been touched up from time to time, and although no longer relevant for the CCNA exam, the underlying technical concepts and theory aren’t the type of thing that will age into irrelevance in the very near future.

In my opinion, development, mentorship, and transparency in thought processes are among the most defining characteristics of good leadership.  Though I have largely moved out of the classroom, the lessons learned and training received in instructional design, pedagogy, and confidence in my subject authority will undoubtedly stick with me, and be valuable in all aspects of my life.  I hope any good information I might impart while republishing this series may be as valuable to you as it was to me.

Happy holidays to all, and I look forward to our continued conversation.

//Levii

 

Easily Remember Passwords with Keyphrases

After a week away from the office over Thanksgiving, a colleague had understandably forgotten their convoluted non-dictionary, non-repeating, non-sequential, non-patterned, mixed-case, alphanumeric & special, 15-character password.  Give it a week or two and who wouldn’t?

As they were trying to think of a memorable password strong enough to meet policy, I thought of an article I’d published circa 2006 that seemed worth putting back out.  It’s hard to believe it was so long ago but, that aside, it’s not a new subject: it was the focus of one of my favorite XKCD comics, appeared on Slashdot in a discussion of alternate password systems, and came up over at Schneier on Security as a perspective on the secret question.

Though I initially visited the subject before the ubiquity of smart devices, when I was more involved in the day-to-day system and network administration of organizations, it still seemed relevant as I considered it today.  Since many people aren’t using systems like KeePass, can’t carry personally owned internet devices into their workplace, or simply aren’t comfortable with truly random passwords, this article represents the same approach, redone in a different code base simply to demonstrate the concept.  Truthfully, it’s fairly simple, reflecting a mix of two of the oldest methods of concealing messages: the substitution and book ciphers.  The idea was, and remains, to give users a straightforward tool for turning short, easily memorable phrases into keys for more complex passwords, and to make cycling passwords as simple as grabbing a new wallet-sized keyphrase card.  In the original incarnation, I printed two (2) of each card so people could keep a copy in a safe location and carry the other with them.  This allows strong passwords on the system, while keeping them easy to remember.

The approach is a relatively straightforward substitution cipher, aided by a randomized card that’s unique to each user.  Carry the card (and effectively your passwords) with you, stick it under a keyboard, tape it to your monitor, etc.  All you need to remember is the simple pass-phrase.  The same key can be used continuously, so the word “WORK” becomes an eight-character alphanumeric & special-character password, maintaining a reasonable degree of security.  Since the actual key is easily remembered, stick cards for different purposes (i.e. work vs. home) in different locations, or keep a backup copy of the card in an accessible location … without significant risk of password compromise.  Furthermore, regular password changes become more likely, since the key itself doesn’t need to be regenerated and there’s less concern about forgetting a batch of new passwords.

There are certainly changes I’d make to this in a production environment; I’d imagine that plastic cards, a unique chart printed on the back of a business card, or a digital version would be an improvement over the stack of printed cards I’d used before.  I’m sharing this as I rework some of my whitepapers and other concepts into what is hopefully useful content, and to contextualize them to spark discussion.  Given the state of existing systems, it’s an idea worth further evaluation as a potential enhancement, or alternative, to the secret question, the site seals that are growing in popularity, or the keypad entry points that aren’t used nearly frequently enough.  Without threat modeling and algorithmic analysis, the biggest concerns I’d focus on are shoulder-surfing as a means of learning the key, and the fact that all passwords are the same length, which might simplify cryptanalysis.  Both, however, are relatively straightforward changes to the design.

The output is here for anyone interested in the method, and for the sake of completeness the source used to generate the output is posted below. As always I welcome any discussion and feedback on applications, how you manage the overhead of password complexity and human memory, or anything else you might feel like throwing my way.
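
The original source isn’t reproduced here; in its place, the following is a minimal sketch of the scheme as described above: a per-user card maps each letter to a random two-character group, so a four-letter phrase like “WORK” yields an eight-character password.  (A production version should draw randomness from the secrets module rather than random, and ideally map digits and punctuation as well.)

```python
# A sketch of the keyphrase-card scheme, not the original source.
import random
import string

SYMBOLS = "!@#$%^&*?+="
POOL = string.ascii_letters + string.digits + SYMBOLS

def make_card(seed=None):
    """Generate a user's unique card: each letter -> a 2-character group."""
    rng = random.Random(seed)  # use the secrets module in production
    return {c: rng.choice(POOL) + rng.choice(POOL)
            for c in string.ascii_uppercase}

def derive(card, phrase):
    """Substitute each letter of the memorable phrase using the card."""
    return "".join(card[c] for c in phrase.upper() if c in card)

card = make_card()
for letter in sorted(card):             # the printable wallet card
    print(letter, card[letter])
print("WORK ->", derive(card, "work"))  # an eight-character password
```

Cycling passwords is then just a matter of printing a fresh card; the memorable phrase never changes.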

 

Negligent Entrustment – Revisited Thoughts for Software Development Contractors

Securing Everything as a Service

The Software, the Process, and FISMA Compliance

In the world of XaaS and the internet of things, and particularly in light of recent developments in the cyber-security language of contracts I’ve reviewed for capture, I’ve spent some time revisiting the literature and my thoughts surrounding this brief essay I wrote in early 2012 as an argument against the application of negligent entrustment in outsourc(ed|ing) IT.

Due in part to the presidential cybersecurity directives, and appearing to be in response to changes in the National Defense Authorization Act, anticipated changes to the DFARS (originating with 2011-D039), and the continued focus on cyber security within all national sectors, a renewed focus on the development lifecycles and standards of programs warrants review, and an increased level of attention that extends beyond QA and EVM.

From a broader perspective, it becomes necessary to fully explore secure development lifecycles (SDL), the role of change management (CM), and the application of supporting standards, frameworks, or models (e.g. ANSI-748 EVM, Agile EVM, Scrum, CMMI, Microsoft’s SDL, etc.) in governing program execution while facilitating high-performing teams.

Background

At issue, and the bits that got me working back into this subject, are the concepts of warranty when included in contract language, and the very real potential of severe ramifications for contractors and consultants alike.  Should they be found failing in the application of governance and policy, the penalty can vary anywhere from a poor contract rating, which impacts future capture, to payment refunds, or the ultimate death penalty of debarment from award eligibility.

For reference, an example from the US Transportation Command is provided as follows:

The contractor represents and warrants that the software shall be free from all computer viruses, worms, time-outs, time bombs, back doors, disabling devices and other harmful or malicious code intended to or which may damage, disrupt, inconvenience or permit access to the software user's or another's software, hardware, networks, data or information.

For the purposes of this short paper, the goal is to begin a discussion on frameworks supporting effective, secure, and practical development of software that meets all necessary areas of compliance.  Under primary consideration are regulatory, contractual, and corporate governance, with methods (either technological or procedural) of audit, verification, and review that can (or should) be included in the process to ease implementation through incorporation into a standard workflow.  The intent is to share my thoughts on integration, and on the effectiveness of aligning the multitude of control requirements in ways that don’t negatively impact project execution and velocity.

Discussion

While seemingly straightforward, there are second- and third-order considerations, often overlooked during capture, evaluation, and execution of programs, that should give any of us in business, IA, software development, or any other affected field pause.

Take, for example, a malicious employee who fears termination.  It’s possible (and some might argue likely) that they slip a backdoor into a computer program.  While this certainly represents a criminal act, an entity that fails to apply its own policies or processes, or simply fails to implement effective controls, can be held liable – and at great cost.  Similarly, malicious injection on the part of a third party would most likely be a criminal act; yet failure to identify the injected source, and to ensure that delivered application code is “free from all …”, remains the duty of the contracted party.

Given the increased attention on industry standards (e.g. ISO 27001, ANSI-748, etc.), cyber-security, and other domains of interest, issues of liability arise in areas that may initially seem unrelated.  CMMI, for instance, wouldn’t appear to be a protection against debarment any more than effective Agile development supports EVM.  Viewed holistically, however, each builds upon the other to develop a new kind of “defense in depth”, where the defense isn’t strictly within the realm of cyber, but extends into effective quantitative management of programs, and reasonable review of data to ensure a firm is practicing all necessary due diligence and control of program efforts.

These issues could be considered secondary to those that can be accommodated by best practices in development and quality assurance (QA), as specified through frameworks and maturity models such as CMMI.  When considering language like the following, however, they necessarily must be included.

If the Government determines, after a security audit (e.g. ST&E), that software delivered under this task order is non-secure, the Government will provide written notice to the contractor of each non-conformity.

Software shall be "non-secure" under this task order if it contains a programming error listed on the current approved version of the CWE/SANS TOP 25 (which can be located at http://www.sans.org/top25-programming-errors) or a web application security flaw listed on the current approved version of the OWASP Top Ten (which can be located at http://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project).

Given that traceable changes to source (SCM), peer review, and clear documentation are among the most effective methods to validate requirements, prevent malicious injection, and reconcile findings from static analysis, it’s worth noting that these are key concepts within most software QA and CPI frameworks.  This traceability additionally supports progress-to-plan reporting and quantitative program management.  At the end of the day, then, integrating and orchestrating existing best practices from the major domains of governance serves the goals of information security in this context.
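
As one small example of baking that traceability into the workflow, a commit-message gate along the lines of the sketch below (the ID convention is hypothetical) can be wired in as a git commit-msg hook, so no change enters the baseline without citing the requirement or finding it addresses.

```python
#!/usr/bin/env python
# A sketch of a git commit-msg hook enforcing requirement traceability;
# the REQ/IA/CWE ID convention is illustrative only.
import re
import sys

REQ_PATTERN = re.compile(r"\b(REQ|IA|CWE)-\d+\b")

def cites_requirement(message: str) -> bool:
    """True if the commit message references at least one tracked ID."""
    return bool(REQ_PATTERN.search(message))

if __name__ == "__main__":
    # git invokes the commit-msg hook with the message file as argv[1].
    with open(sys.argv[1]) as f:
        if not cites_requirement(f.read()):
            sys.exit("Commit rejected: cite a requirement ID (e.g., REQ-123).")
```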

To achieve convergence, information security professionals need a new way of thinking, and supporting frameworks and tools that describe a greater role for governance applications.  Akin to the growth of the business analyst’s role, which often includes systems, requirements, and process engineering subdomains, IA professionals must remain aware of the broader scope of technological and procedural avenues to achieving compliance.  We must additionally understand the industry, the applicable regulatory and legal issues, and the goals of the enterprise well enough to support efforts in process re-engineering in ways that deliver value.

A New Market Advantage?

While this type of compliance, if enforced, could certainly be considered a program risk, I’d prefer to think of it as a strategic opportunity for those that lean forward in developing their capabilities in software and systems audit.  While all companies necessarily innovate, what I suggest is more transformative in nature.

The National Cybersecurity and Critical Infrastructure Protection Act of 2013 contains guidance, and further opens transparent dialogue in public-private information sharing in other sectors.  Consider the ramifications should information security and assurance extend beyond the requirements of existing regulation such as SOX, GLBA, etc., and beyond industry self-regulation in the form of PCI or BITS SAP.  Given the criticality of IT to most organizations, the software underpinning most of these systems, and the emergent requirement for some degree of accountability from the developers of those systems, it’s nearly a foregone conclusion that additional guidance will be developed, that reporting and audit will be required, and that positive control/traceability of control effectiveness will be necessary to protect firms from liability.

If we, as leaders, consider these scenarios reasonably likely, it stands to reason that we should be planning for disruptive changes in the development and delivery of solutions to our customers, and should ensure appropriate control structures are developed and enforced internally.  For those of us that provide business services, consulting, software engineering & integration, or any number of other services, it additionally stands to reason that the adoption of orchestrated and auditable processes, enabled by technological integration, reduces the cost of compliance, turning overhead into value.  Additionally, the systematic application and quantitative management of these (as with any control framework) encourages continuous improvement and develops capacity, opening the doors for the creation of a new set of offerings based on internal capabilities.

When disruption within an industry is seen as an opportunity for transformative change, rather than a threat to the bottom line, there is certainly rationale to begin development of new offerings as early as possible, and to become involved in shaping the final outcome of required compliance.

In particular, in situations such as this – where a large, sustained, and relatively unfilled market may soon exist, created through legislation akin to that behind the financial and health auditors who validate compliance with SOX, GLBA, HIPAA, etc. – it stands to reason that those firms adopting and developing these capabilities now will have a significant advantage.

Final Remarks

From a DoD contractor’s perspective, it’s important to note that these issues are particularly problematic in low-price awards, or variations where evaluators don’t have the flexibility to consider them within the evaluation of staffing, technical approach, or price-realism assessments.

Given that the majority of RFPs do not specifically include this type of language, yet the requirements still exist under FISMA and through the inclusion of the FAR clause(s) requiring compliance with DoD 8500.2 by reference (or referenced reference), it is difficult to justify the resources needed to comply with all controls within a cost narrative or delivered Basis of Estimate (BoE).  Until mandatory compliance is set as an evaluated criterion in all contracts, rather than being included strictly within the statement of work, there remains limited opportunity for execution of such a plan.  Similarly, until we as contractors highlight this oversight (or until more severe breaches occur), a change to include this language is unlikely.  It remains a valuable, and important, area of change in organizational process, with opportunities likely in various markets.

I’ll certainly revisit this subject in the near future as I develop the actual security plans, compliance matrices, overlap charts, and guides for framework integration.  In order for potential solutions to be applicable and available, my intent is to use open-source ALM tools to orchestrate processes for development, tied into ECM, BPM, and BI utilities for routing, reporting, knowledge management, and analysis.  Look for a review of the available solutions in each of these spaces in the next parts of this series.

//Levii


 

Posted on November 30, 2013 in Business, Information Technology

 


 