
Category Archives: Business

Negligent Entrustment – Revisited Thoughts for Software Development Contractors

Securing Everything as a Service

The Software, the Process, and FISMA Compliance

In the world of XaaS and the internet of things, and particularly in light of the cyber-security language in contracts I've recently reviewed for capture, I've spent some time revisiting the literature and my thoughts surrounding a brief essay I wrote in early 2012 arguing against the application of negligent entrustment to outsourc(ed|ing) IT.

Due in part to the presidential cybersecurity directives, changes in the National Defense Authorization Act, anticipated changes to the DFARS (originating with 2011-D039), and the continued focus on cyber security across all national sectors, the development lifecycles and standards of programs warrant review and a level of attention that extends beyond QA and EVM.

From a broader perspective, it becomes necessary to fully explore secure development lifecycles (SDL), the role of change management (CM), and the application of supporting standards, frameworks, or models (e.g. ANSI-748 EVM, Agile EVM, Scrum, CMMI, Microsoft's SDL) in governing program execution while facilitating high-performing teams.

Background

At issue, and the bits that got me working back into this subject, are the concepts of warranty included in contract language and the very real potential of severe ramifications for contractors and consultants alike. Should they be found failing in the application of governance and policy, the penalty can vary anywhere from a poor contract rating, which impacts future capture, to payment refunds, or the ultimate death penalty of debarment from award eligibility.

For reference, an example from the US Transportation Command is provided as follows:

The contractor represents and warrants that the software shall be free from all computer viruses, worms, time-outs, time bombs, back doors, disabling devices and other harmful or malicious code intended to or which may damage, disrupt, inconvenience or permit access to the software user's or another's software, hardware, networks, data or information.

For the purposes of this short paper, the goal is to begin a discussion on frameworks supporting effective, secure, and practical development of software that meets all necessary areas of compliance.  Under primary consideration are regulatory, contractual, and corporate governance, along with methods (technological or procedural) of audit, verification, and review that can (or should) be included in the process to ease implementation through incorporation into a standard workflow.  The intent is to share my thoughts on integration and on aligning the multitude of control requirements in ways that don't negatively impact project execution and velocity.

Discussion

While seemingly straightforward, there are second- and third-order considerations, often overlooked during the capture, evaluation, and execution of programs, that should give any of us in business, IA, software development, or any other affected field pause.

Take, for example, a malicious employee who fears termination. It's possible (and some might argue likely) that they slip a backdoor into a program.  While this certainly represents a criminal act, an entity that fails to apply its own policies or processes, or simply fails to implement effective controls, can be held liable – and at great cost.  Similarly, malicious injection by a third party would most likely be a criminal act; failure to identify the injected source, and to ensure that delivered application code is "free from all …", remains the duty of the contracted party.

Given the increased attention on industry standards (e.g. ISO 27001, ANSI-748), cyber-security, and other domains of interest, issues of liability arise in areas that may initially seem unrelated.  CMMI, for instance, wouldn't appear to be a protection against debarment any more than effective Agile development would appear to support EVM. Viewed holistically, however, each builds upon the other to develop a new kind of "defense in depth" where the defense isn't strictly within the realm of cyber, but extends into effective quantitative management of programs and reasonable review of data to ensure a firm is practicing all necessary due diligence and control of program efforts.

These issues could be considered secondary to those that can be accommodated by best practices in development and quality assurance (QA) and specified through frameworks and maturity models such as CMMI.  When considering contract language like the following, however, they necessarily must be included.

If the Government determines, after a security audit (e.g. ST&E), that software delivered under this task order is non-secure, the Government will provide written notice to the contractor of each non-conformity.

Software shall be "non-secure" under this task order if it contains a programming error listed on the current approved version of the CWE/SANS TOP 25 (which can be located at http://www.sans.org/top25-programming-errors) or a web application security flaw listed on the current approved version of the OWASP Top Ten (which can be located at http://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project).
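For illustration only (this is not part of the contract language), the sketch below shows, in C#, the kind of programming error those lists cover – CWE-89, SQL injection, which appears on both the CWE/SANS Top 25 and the OWASP Top Ten – alongside the parameterized fix. The table name, query, and helper class are hypothetical.

    // Illustrative only: CWE-89 (SQL injection), one of the list entries that
    // would render a delivery "non-secure" under the clause above.
    using System.Data.SqlClient;

    public static class AccountLookup
    {
        // Non-secure: user input is concatenated directly into the SQL text.
        public static SqlCommand Vulnerable(SqlConnection conn, string userName)
        {
            return new SqlCommand(
                "SELECT * FROM Accounts WHERE UserName = '" + userName + "'", conn);
        }

        // Remediated: the value is passed as a parameter, never as SQL text.
        public static SqlCommand Parameterized(SqlConnection conn, string userName)
        {
            var cmd = new SqlCommand(
                "SELECT * FROM Accounts WHERE UserName = @userName", conn);
            cmd.Parameters.AddWithValue("@userName", userName);
            return cmd;
        }
    }

Static analysis against those same lists, and peer review of exactly this sort of change, is what the clause effectively obligates the contractor to perform before delivery.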

Given that traceable changes to source (SCM), peer review, and clear documentation are among the most effective methods to validate requirements, prevent malicious injection, and reconcile findings from static analysis, it's worth noting that these are key concepts within most software QA and CPI frameworks.  This traceability additionally supports progress-to-plan reporting and quantitative program management.  At the end of the day, then, integrating and orchestrating existing best practices from the major domains of governance serves the goals of information security in this context.
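As one concrete, hypothetical example of automating a piece of that traceability, the sketch below uses the open-source LibGit2Sharp library to flag any commit whose message fails to cite a requirement ID. The REQ-### convention and the repository path are assumptions for illustration, not part of any particular framework.

    using System;
    using System.Text.RegularExpressions;
    using LibGit2Sharp; // open-source .NET library for reading Git repositories

    class TraceabilityCheck
    {
        static void Main()
        {
            // Assumed convention: every commit message cites a requirement ID
            // (e.g. REQ-123) so changes can be reconciled against the requirements
            // baseline, peer reviews, and static-analysis findings.
            var requirementId = new Regex(@"\bREQ-\d+\b");

            using (var repo = new Repository(@"C:\src\my-program")) // illustrative path
            {
                foreach (Commit commit in repo.Commits)
                {
                    if (!requirementId.IsMatch(commit.Message))
                    {
                        Console.WriteLine("Untraceable change: " + commit.Sha.Substring(0, 8)
                                          + " by " + commit.Author.Name);
                    }
                }
            }
        }
    }

A report like this, run as part of the normal build, is cheap to produce and doubles as evidence of due diligence when the delivered code is later audited.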

To achieve this convergence, information security professionals need a new way of thinking, along with supporting frameworks and tools that describe a greater role for governance applications.  Akin to the growth of the business analyst's role, which often includes systems, requirements, and process engineering subdomains, IA professionals must remain aware of the broader scope of technological and procedural avenues to achieving compliance.  We must additionally understand the industry, applicable regulatory and legal issues, and the goals of the enterprise well enough to support process re-engineering efforts in ways that deliver value.

A New Market Advantage?

While this type of compliance, if enforced, could certainly be considered a program risk, I'd prefer to think of it as a strategic opportunity for those that lean forward in developing their capabilities in software and systems audit.  While all companies necessarily innovate, what I suggest is more transformative in nature.

The National Cybersecurity and Critical Infrastructure Protection Act of 2013 contains guidance and further opens transparent dialogue on public-private information sharing in other sectors. Consider the ramifications should information security and assurance extend beyond existing regulatory requirements such as SOX and GLBA, and beyond industry self-regulation in the form of PCI or BITS SAP.  Given the criticality of IT to most organizations, the software underlying most of these systems, and the emerging requirement for some degree of accountability from the developers of those systems, it's nearly a foregone conclusion that additional guidance will be developed, that reporting and audit will be required, and that positive control/traceability of control effectiveness will be necessary to protect firms from liability.

If we, as leaders, consider these scenarios reasonably likely, it stands to reason that we should be planning for disruptive changes in the development and delivery of solutions to our customers, and ensuring appropriate control structures are developed and enforced internally.  For those of us providing business services, consulting, software engineering and integration, or any number of other services, it additionally stands to reason that the adoption of orchestrated and auditable processes, enabled by technological integration, reduces the cost of compliance, turning overhead into value.  Additionally, the systematic application and quantitative management of these (as with any control framework) encourages continuous improvement and develops capacity, opening the door to a new set of offerings based on internal capabilities.

When disruption within an industry is seen as an opportunity for transformative change rather than a threat to the bottom line, there is certainly a rationale to begin development of new offerings as early as possible and to become involved in shaping the final form of required compliance.

In situations like this in particular, where a large, sustained, and relatively unfilled market may soon exist – one created through legislation, akin to the market for financial and health auditors who validate compliance with SOX, GLBA, HIPAA, and the like – it stands to reason that firms adopting and developing these capabilities now will have a significant advantage.

Final Remarks

From a DoD contractor's perspective, it's important to note that these issues are particularly problematic in low-price awards (or variations thereof) where evaluators don't have the flexibility to consider them within the evaluation of staffing, technical approach, or price realism.

Given that the majority of RFPs do not specifically include this type of language, yet the requirements still exist under FISMA and through the FAR clause(s) requiring compliance with DoD 8500.2 by reference (or referenced reference), it is difficult to justify the resources needed to comply with all controls within a cost narrative or delivered Basis of Estimate (BoE).  Until mandatory compliance is set as an evaluated criterion in all contracts, rather than being included strictly within the statement of work, there remains limited opportunity to execute such a plan.  Similarly, until we, as contractors, highlight this oversight (or until more severe breaches occur), a change to include this language is unlikely.  It remains a valuable and important area of organizational process change, with opportunities likely in various markets.

I'll certainly revisit this subject in the near future as I develop the actual security plans, compliance matrices, overlap charts, and guides for framework integration.  In order for potential solutions to be applicable and available, my intent is to use open-source ALM tools to orchestrate development processes, tied into ECM, BPM, and BI utilities for routing, reporting, knowledge management, and analysis.  Look for a review of the available solutions in each of these spaces in the next parts of this series.

//Levii


 

Posted on November 30, 2013 in Business, Information Technology

 


Merging Word documents … What a pain


For the umpteenth time in my career I had a document in MS Word (2010 in this case) that was reviewed by multiple individuals (about 8) with changes to formatting, content, etc. along with reviewer comments. These then had to be merged back into a single document, which is certainly easier than it used to be – but is still incredibly inefficient. Let’s look at the normal flow:

  1. Document Created (O1)
  2. Send to Review (email or ??)
  3. Receive n copies of the document back (R1 – Rn)
  4. Make a copy of the original doc (O2) and combine with the revised copy R1 [in the original document]
  5. Save O2 and close both files (or all of the merge panes).
  6. Select the combine option again, find O2 and merge in R2.
  7. Rinse and repeat steps 5 & 6 through Rn.

Certainly, using the version tracking of SharePoint 2010/2013 would be considerably easier while also allowing for concurrent editing, but that's not always an option. Thinking of a large organization where this is done many hundreds of times each day, by thousands of employees, I was surprised that a small utility to perform the function of "merge these N files with O1" didn't pop up on a quick Google/StackOverflow search. In my one instance, I missed a file … or messed something up the first time through, which killed about 20 minutes of my day that could otherwise have been spent actually addressing the commentary. To that end I decided to do something about it, which I'll share in source once I've gone through it enough that it wouldn't be embarrassing.  It's gotten a little bit cleaner, and a whole lot faster – but since it's based on InterOp it's a temporary solution until the feature is fully implemented in Eric White's EXCELLENT OpenXmlPowerTools library (link to the issue).

For the time being we'll just call this WordMerge; it runs as a standalone executable. There isn't a heap of validation and error checking in place at the moment (e.g. a bad output path can be typed in, which will throw a general exception), but it has been through the paces and works pretty well up to the 50 documents I tried. For my use, I'm perfectly happy with how it runs – but if anyone else grabs the concept before I get code out there, I'm open to suggestions (a SharePoint extension, Office plugin, and a WPF version are already planned, BTW).

You can see the interface in the screenshot below … not much to explain.

[Screenshot: the WordMerge interface]
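Under the hood it's just Word's compare/combine machinery driven through Interop. Until I post the actual source, here's a minimal sketch of the core loop as I'd write it against the Word 2010 Interop API; the file paths, merge options, and lack of error handling are illustrative assumptions, not the shipped WordMerge code.

    using System;
    using Word = Microsoft.Office.Interop.Word;

    class MergeSketch
    {
        static void Main()
        {
            // Hypothetical inputs: the original O1 and reviewer copies R1..Rn.
            string original = @"C:\reviews\O1.docx";
            string[] revisions = { @"C:\reviews\R1.docx", @"C:\reviews\R2.docx" };
            string output = @"C:\reviews\O1_merged.docx";

            var app = new Word.Application { Visible = false };
            try
            {
                // Open O1 once, then fold each reviewer copy into it in turn
                // (the manual steps 4 through 7 above, without the open/save/close churn).
                Word.Document doc = app.Documents.Open(original);
                foreach (string revision in revisions)
                {
                    doc.Merge(revision,
                              Word.WdMergeTarget.wdMergeTargetCurrent,
                              true,                                        // DetectFormatChanges
                              Word.WdUseFormattingFrom.wdFormattingFromCurrent,
                              false);                                      // AddToRecentFiles
                }
                doc.SaveAs2(output);
                doc.Close(Word.WdSaveOptions.wdDoNotSaveChanges);
            }
            finally
            {
                app.Quit();
            }
        }
    }

The merge order still matters (more on that below), which is why the utility lets you re-order the list of revised documents before it runs.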

Certainly, it isn't foolproof to merge everything in at once. If a person moved a lot of content and other reviewers modified that content, MS Word would be unaware of the "right" order in which to resolve conflicts. In these cases there may be straggling words that weren't part of the move (especially if the move was processed after the changes in the merge order), which I've tried to address by allowing a re-order of the merged documents list.  That notwithstanding, the ability to toggle reviewers and types of changes in the review pane thankfully makes reconciling a much simpler process.

I'd be interested in how you normally handle disparate review documents … one at a time, copying over what you'd like to keep; merging them all by hand and accepting/rejecting changes; or something else entirely?  I'll pull thoughts into the merger and possibly tie it in with some other work I'm contemplating that uses NLP parsing systems in conjunction with context-free grammar generation to assist edits and rewrites from a Knowledge Management repository.  That's definitely further out on the horizon though.  Next up is a syntax highlighter for Word – simply because I'm tired of the wasted time and inconsistent formatting in software documentation that inevitably results if you've got to embed source.  If there are suggestions for that before I get it released, feel free to send them my way.

Grab it, use it, save yourself some headache. If you’ve got feedback for me, let me know.

//Levii

 

The Economics and Risks of Electronic Employee Monitoring

Employees and other inside personnel pose some of the largest risks to an organization.  The overall risk profile exists across a continuum that includes losses of employee productivity, breaches of confidentiality, and compromises of data integrity and the overall availability of organizational resources.  It is therefore essential that a firm develop and maintain an effective posture for mitigating these risks, one that complies with statutory requirements and limitations while balancing the rights of employees to maintain a modicum of privacy in and out of the workplace.

 

Posted on February 1, 2013 in Business, Information Technology

 


Outsourced Data Handling: Brief Thoughts on Negligent Entrustment Applicability

I find this subject area to be among the most interesting and fluid in IT today, as the place where law meets technology is still being formed in the US. Our system of common law, combined with the speed of our legislative process (or lack thereof) relative to that of technological innovation, leaves gaps in legal findings that are tried on a regular basis to develop judicial precedent. Precedent – the interpretation of a tort as compared to what a judge finds similar in other cases, combined with the doctrine of stare decisis (until opposing legislation is enacted) – covers these subjects more than the actual state of congressional acts or agency regulation. It is an incredibly complicated subject, where many educated in law are not technologists and many technologists have an inadequate education in business law.

Negligent entrustment is covered under state civil codes within the personal injury set of torts (Kionka, 1999). What Rustad and Koenig (2007) detail are issues that might someday become liabilities for a US company, not necessarily those that are currently primary liability concerns. It is still an interesting thought exercise and a set of items that must be evaluated to form an effective risk profile of outsourcing activities. Also, since an individual is entitled to sue for anything, the costs for an organization to protect itself from suit, even in those cases where the suit is settled or won, could be substantial. In order to reduce these eventualities, the due care and due diligence of audit and contract enforcement, validation of contract performance measurements, and adherence to the law of the land where the data originates must be among the foremost concerns of the CIO, CISO, and compliance officer. The ways in which courts rule on data in the coming years have the potential to affect the IT and software industry in the US. A finding in favor of suit for negligent entrustment in outsourced data would set a dangerous precedent, deeming the data itself a dangerous item. This concept could then easily bleed into many other areas, including software development liability (which is cause for another paper entirely). While there have been attempts to prosecute authors of hacking tools with criminal offenses, to date such activity has been upheld as protected under the 1st Amendment – but should that protection fall, all users and creators of data would have significant potential liabilities foisted upon them.

As pointed out by Rustad and Koenig (2007), the incidence of lawsuits brought for negligent security practices is on the rise, but all cases that have resulted in an award have been the result of direct liability. Based on the research conducted and literature reviewed, there have yet to be any cases of indirect liability or negligent entrustment decided in US courts, since negligence itself is specific to a failure of due care (in the absence of strict liability statutes) and there is no basis for negligent entrustment when the instrument itself is not directly capable of causing harm (Kionka, 1999). Additionally, when there is a superseding cause that would not have been reasonably foreseeable, there would not be an issue of liability. If, however, the outsourcing operations fail to include the due diligence and due care of a reasonable man, then the superseding cause argument would fail, and liability would revert to the company as the tortfeasor, since a reasonable expectation of data breach would likely meet the requirements for a proximate cause liability claim (Clarkson, Miller & Jentz, 2003).

Data breaches are inevitable (Huang, Hu & Behara, 2008), and the California Appellate Court found that a claim of inevitability on the part of the claimant, or evidence of that inevitability, could be used to show that the negligence of the defendant, if any, was not a proximate cause of the harm (Smith v. San Francisco, 1953). While it remains essential to protect data under direct liability scenarios, where a failure to exercise due care can be actionable, imagine a world where this was not the case, or where the ability to exclude warranties of merchantability and/or suitability for purpose could not be accomplished through contract. Oracle Corporation might be held liable because its software held a vulnerability that was exploited by a hacker attacking a bank, or Microsoft could be sued if a workstation crashed and lost some personal data. As highlighted by Ferrera, Lichtenstein, Reder, Bird and Schiano (2004), both of these situations can be shown to have actual damages, though the effect of allowing such inevitable actions to fall back on the progenitor of the system would have chilling repercussions for all transactions and systems in the digital world.

 

References

Ferrera, G. R., Lichtenstein, S. D., Reder, M. E. K., Bird, R. C., & Schiano, W. T. (2004). CyberLaw: Text and cases (2nd ed.). Thomson Corporation.

Huang, C., Hu, Q., & Behara, R. S. (2008). An economic analysis of the optimal information security investment in the case of a risk-averse firm. International Journal of Production Economics, 114(2), 793–804. doi:10.1016/j.ijpe.2008.04.002

Kionka, E. J. (1999). Torts in a nutshell (3rd ed.). St. Paul, MN: West Group.

Rustad, M. L., & Koenig, T. H. (2007). Negligent entrustment liability for outsourced data. Journal of Internet Law, 10(10), 3–6. Retrieved from http://web.ebscohost.com.library.capella.edu/ehost/detail?sid=e04a605f-8b74-40ec-8586-a33effec288c@sessionmgr115&vid=1&hid=112&bdata=JnNpdGU9ZWhvc3QtbGl2ZSZzY29wZT1zaXRl#db=bth&AN=24619583

Smith v. San Francisco, 117 Cal. App. 2d 749, 256 P.2d 999 (1953)

Clarkson, K. W., Miller, R. L., & Jentz, G. A. (2003). West's business law: Text and cases (9th ed.). Thomson Learning.

 

Posted on November 17, 2012 in Business, Information Technology

 

Secure Software Development Environments

This paper is sourced from my Master's thesis, which covers what I found, at the time, to be a gap in major security frameworks in addressing the environment/enclave used for the development of source. Note that it does not attempt to repeat the considerable research on the development and engineering practices of writing the code so much as it attempts to provide a framework, background, and discussion for the environment used in that process. The paper is a bit dated, and at some point I hope to refresh it to include virtual routing and logically secured virtual machines / virtual desktop infrastructures.

I warn you in advance, this is written more like a whitepaper than formal research (as adapted) and could use considerable cleanup. It is still relevant and has a bit of good information for those looking to undertake such a task though.

That said, the document is available here.

//Levii

 
 
 