
A Holiday Break

With the holiday season upon us, I’ll be taking a break from writing original posts and papers for a couple of weeks. I won’t be taking this time to rest on my laurels, however. I have plenty of projects to complete around the house, a decade of photos to sort and upload, and a number of older works that need to be dusted off and posted so that others might get some value from them.

Happy Holidays

I’ve continued to love open collaboration, and strive to share my knowledge and expertise freely with those who ask for it. As every new cohort began, I opened with the same statement, which I still hold to be a positive truth:

Though I may be billed as your teacher, trainer, or course instructor, none of these titles is necessarily true. As a room of professionals, each of you has chosen to be here. I cannot teach those who don’t want to learn, nor can I instruct those who don’t wish to listen. I can guide, assist, advise, listen, and mentor. I will freely answer any question with which I’m familiar and certain, and research answers to those I am not. I’m a resource, a coach, and your collaborator; and I look forward to the next 40 days we’ll spend here together.

It’s in this spirit that I’ve selected the training decks from the CNAP program to post over the next couple of weeks. The first series to follow this post will be the Cisco Networking Academy Program (CNAP) training material, slides, labs, and other handouts that I prepared as an instructor between 2000 and 2005. They’ve been touched up from time to time, and although they’re no longer relevant to the CCNA exam, the underlying technical concepts and theory aren’t the sort of material that will age into irrelevance any time soon.

In my opinion, development, mentorship, and transparency of thought process are among the most defining characteristics of good leadership. Though I have largely moved out of the classroom, the lessons learned and the training received in instructional design, pedagogy, and confidence in my subject-matter authority will undoubtedly stick with me, and be valuable in all aspects of my life. I hope any good information I might impart while republishing this series is as valuable to you as it was to me.

Happy holidays to all, and I look forward to our continued conversation.

//Levii

 

DoD Contracts and EVM Requirements: Worth a Second Look for Program Managers

I’ve written somewhat frequently on the benefits of quantitative management of IT programs, and specifically on some of the problems faced by defense contractors in aligning industry best practice with ANSI-748 EVM control requirements. I’m certainly not the only voice that has advocated for reform on the acquisitions side of the house, and there are more than a few good perspectives on alignment with Agile practices, Earned Value for Business Value, CMMI support for EVM, and the various tools and processes that make it all work together. I’d therefore be remiss if I didn’t open this post by giving credit where it’s due. My perspective in this domain has been heavily influenced by Dale G., Glen B. Alleman, Paul J. Solomon, all the contributors on the StackExchange network, and a significant number of papers in both the academic and practitioner literature. For anyone interested, I’ll post my reference library as a BibTeX entry at a later date, or feel free to contact me on any site I frequent and I’ll package up the pertinent parts to send you.

The issue of compliance has been particularly troublesome for projects that are effectively fixed in terms of cost (i.e. the contract award value) and resources (i.e. the bid LoE supporting the effort), while the scope of any particular work package remains flexible (e.g. sustainment, maintenance, or Agile development). Though I’ve argued that the DCMA guidance and intent statement, in conjunction with the ANSI specification itself, both allow for the use of Technical Performance Measures (TPMs) as a method of tracking progress-to-plan, the assignment of value to technical tasks, the construction of the Integrated Master Schedule (IMS), and the development of the Work Breakdown Structure (WBS) are often constrained by the Statement of Work (SoW)/Performance Work Statement (PWS), or by the acquisition (AQ) process.

Thankfully, a footnote and a minor change in the November 26, 2013 reissuance of DoD 5000.02 streamline a lot of this effort, particularly for programs under the $50M threshold. Though this particular language gives 180 days to redevelop the guidance, the interim instruction appears specifically intended to better facilitate the acquisition process, and could well be read to open up the TPM method of metric utilization by aligning more closely with the Systems Engineering sub-processes.

DoD 5000.02 EVM Requirement - November 26, 2013

This change reinvigorates my original argument, and while the DCMA has allowed this type of cost-allocation to technical measures for some time, acquisition (AQ) policy has constrained its application. Realizing that many smaller programs bore an undue level of overhead generating financial metrics that didn’t necessarily trace back to the value being derived, and that numerous projects using EVM strictly to comply with AQ policy have failed to provide effective measures for program managers, it seems that we’re finally getting there. The option for AQ to define EV on technical measures can tightly align industry best practices (e.g. Agile, Kanban, CMMI, ITIL, or ISO processes) with the EVM standard, and enable us to price contracts more competitively. Without the additional overhead and financial wizardry necessary to get a non-EVM program to report as if it were one, and with the ability to “manage by the numbers,” this opens an avenue to much more productive programs, and a reduction in taxpayer-funded waste.
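To make “defining EV on technical measures” concrete, here’s a minimal sketch of the arithmetic; the work packages, TPM weights, and dollar figures below are purely illustrative assumptions, not drawn from any real program or from the instruction itself.

// Illustrative only: taking earned value (BCWP) against technical
// performance measures rather than purely financial milestones.
using System;
using System.Linq;

class TpmEarnedValue
{
    static void Main()
    {
        // Hypothetical work packages: (budgeted cost, fraction of TPM target achieved)
        var packages = new[]
        {
            (Budget: 120_000m, TpmComplete: 0.75m),  // e.g. throughput target
            (Budget:  80_000m, TpmComplete: 0.40m),  // e.g. defect-density target
            (Budget:  50_000m, TpmComplete: 1.00m)   // e.g. latency target met
        };

        decimal ev = packages.Sum(p => p.Budget * p.TpmComplete); // BCWP
        decimal pv = 190_000m; // BCWS per the IMS (illustrative)
        decimal ac = 175_000m; // ACWP from the accounting system (illustrative)

        // CPI = EV/AC and SPI = EV/PV: the standard indices, now traceable
        // back to demonstrated technical progress rather than financial proxies.
        Console.WriteLine($"EV = {ev:C0}, CPI = {ev / ac:F2}, SPI = {ev / pv:F2}");
    }
}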

The biggest surprise, however, lies in what’s missing. A close look at the EVM requirements themselves, as outlined below, shows that task orders above $20M but below $50M must use a system that is compliant with the EVM guidelines (and therefore permits the optional use of TPMs), but not a formally validated EVMS.

For the near future, it would certainly appear that smaller projects, and particularly those supporting more fluid requirements that are realistically best described as Level of Effort (LoE) and/or Time and Materials (T&M) contracts, have had the noose loosened a notch. I’d certainly advise any PM in the industry who’s under this requirement to take a closer look … you might find that you’ve got the opportunity to improve not only your reporting, but the processes themselves, gaining efficiency and analytic intelligence for continuous improvement.

As with everything else on this site, I hope to hear your thoughts. Contact me, or leave your comments below, and I’ll do my best to get back to you.

//Levii

 

Posted on December 14, 2013 in Business

 


Easily Remember Passwords with Keyphrases

After a week away from the office over Thanksgiving, a colleague had understandably forgotten their convoluted non-dictionary, non-repeating, non-sequential, non-patterned, mixed-case, alphanumeric-and-special, 15-character password. Give it a week or two and who wouldn’t?

As they were trying to think of a memorable password strong enough to meet policy, I thought of an article I’d published circa 2006, and decided it was worth putting back out. It’s hard to believe it was so long ago, but that aside, it’s not a new subject: it was the focus of one of my favorite XKCD comics, came up on Slashdot in a discussion of alternate password systems, and appeared over at Schneier on Security as a perspective on the secret question.

Keyphrase chart

Though I first visited the subject before the ubiquity of smart devices, when I was more involved in the day-to-day system and network administration of organizations, it still seemed relevant as I considered it today. Since many people aren’t using systems like KeePass, can’t carry personally owned internet devices into their workplace, or simply aren’t comfortable with truly random passwords, this article presents the same approach, redone in a different code base simply to demonstrate the concept. Truthfully though, it’s fairly simple, and reflects a mix of two of the oldest methods of concealing messages: the substitution cipher and the book cipher. The idea was, and remains, to give users a straightforward tool for turning short, easily memorable phrases into keys for more complex passwords, and to make cycling passwords as simple as grabbing a new wallet-sized keyphrase card. In the original incarnation, I printed two (2) of each card so people could keep one copy in a safe location and carry the other with them. This allows strong passwords on the system while keeping them easy to remember.

The approach is a relatively straightforward substitution cipher, strengthened by a randomized card that’s unique to each user. Carry the card (and effectively your passwords) with you, stick it under a keyboard, tape it to your monitor, etc.; all you need to remember is the simple passphrase. The same key can be used continuously, so the word “WORK” becomes an eight-character alphanumeric and special-character password, maintaining a reasonable degree of security. Since the actual key is easily remembered, you can keep cards for different purposes (i.e. work vs. home) at different locations, or leave a backup copy of the card in an accessible location, without significant risk of password compromise. Furthermore, regular password changes become more likely, since the key itself doesn’t need to be regenerated and there’s less concern about forgetting a batch of new passwords.

There are certainly changes I’d make to this in a production environment; I’d imagine that plastic cards, a unique chart printed on the back of a business card, or a digital version would be an improvement over the stack of printed cards I’d used before. I’m sharing this as I rework some of my whitepapers and other concepts into what is hopefully useful content, and to contextualize them to spark discussion. Given the state of existing systems, it’s an idea worth further evaluation, as a potential enhancement, or alternative, to the secret question, the site seals that are growing in popularity, or the keypad entry points that aren’t used nearly frequently enough. Without formal threat modeling and algorithmic analysis, the biggest concerns I’d focus on are shoulder-surfing as a means to learn the key, and the fact that all passwords are the same length, which might simplify cryptanalysis. Both, however, are relatively straightforward changes to the design.

The output is here for anyone interested in the method, and for the sake of completeness the source used to generate the output is posted below. As always, I welcome any discussion and feedback on applications, on how you balance the overhead of password complexity against human memory, or on anything else you might feel like throwing my way.
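Since reposts sometimes drift from their attachments, here is a minimal sketch of the concept in C#. To be clear, this is an illustration of the idea rather than the original source: the character pool, the two-characters-per-letter cell size, and all identifiers are assumptions made for the demo.

// Sketch of the keyphrase-card concept: generate a randomized card mapping
// each letter to a short random cell, then derive passwords by substitution.
using System;
using System.Linq;
using System.Security.Cryptography;

class KeyphraseCard
{
    // Demo character pool; a production version would tune this to the target password policy.
    const string Pool = "ABCDEFGHJKLMNPQRSTUVWXYZabcdefghjkmnpqrstuvwxyz23456789!@#$%^&*";

    // One random two-character cell per letter of the alphabet.
    static string[] GenerateCard()
    {
        var card = new string[26];
        var bytes = new byte[2];
        using var rng = RandomNumberGenerator.Create();
        for (int i = 0; i < 26; i++)
        {
            rng.GetBytes(bytes);
            // Modulo bias is tolerable for a demonstration; a production
            // version should use rejection sampling.
            card[i] = $"{Pool[bytes[0] % Pool.Length]}{Pool[bytes[1] % Pool.Length]}";
        }
        return card;
    }

    // Substitute each letter of the phrase with its cell on the card.
    static string Derive(string[] card, string phrase)
    {
        return string.Concat(phrase.ToUpperInvariant()
                                   .Where(char.IsLetter)
                                   .Select(c => card[c - 'A']));
    }

    static void Main()
    {
        var card = GenerateCard();
        for (int i = 0; i < 26; i++)
            Console.WriteLine($"{(char)('A' + i)} -> {card[i]}");

        // A four-letter phrase yields an eight-character password.
        Console.WriteLine($"WORK -> {Derive(card, "WORK")}");
    }
}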

 

Negligent Entrustment – Revisited Thoughts for Software Development Contractors

Securing Everything as a Service

The Software, the Process, and FISMA Compliance

In the world of XaaS, the internet of things, and particularly in light of developments in the cyber-security language of contracts I’ve recently reviewed for capture, I’ve spent some time revisiting the literature and my thoughts surrounding this brief essay, written in early 2012 as an argument against the application of negligent entrustment to outsourc(ed|ing) IT.

Due in part to the presidential cybersecurity directives, and appearing to be in response to changes in the National Defense Authorization Act, anticipated changes to the DFARS (originating with 2011-D039), and the continued focus on cyber security within all national sectors, a renewed focus on the development lifecycles and standards of programs warrants review, and an increased level of attention that extends beyond QA and EVM.

From a broader perspective, it becomes necessary to fully explore secure development lifecycles (SDL), the role of change management (CM), and the application of supporting standards, frameworks, or models (e.g. ANSI-748 EVM, Agile EVM, Scrum, CMMI, Microsoft’s SDL, etc.) in governing program execution, while facilitating high-performing teams.

Background

At issue, and the bits that got me working back into this subject, are the concepts of warranty when included in contract language, and the very real potential of severe ramifications for contractors and consultants alike. Should they be found failing in the application of governance and policy, the penalty can vary anywhere from a poor contract rating, which impacts future capture, to payment refunds, or the ultimate death penalty of debarment from award eligibility.

For reference, an example from the US Transportation Command is provided as follows:

The contractor represents and warrants that the software shall be free from all computer viruses, worms, time-outs, time bombs, back doors, disabling devices and other harmful or malicious code intended to or which may damage, disrupt, inconvenience or permit access to the software user's or another's software, hardware, networks, data or information.

For the purposes of this short paper, the goal is to begin a discussion on frameworks supporting the effective, secure, and practical development of software that meets all necessary areas of compliance. Under primary consideration are regulatory, contractual, and corporate governance, with methods (technological or procedural) of audit, verification, and review that can (or should) be included in the process to ease implementation through incorporation into a standard workflow. The intent is to share my thoughts on integration, and on the effectiveness of aligning the multitude of control requirements in ways that don’t negatively impact project execution and velocity.

Discussion

While seemingly straightforward, there are second- and third-order considerations, often overlooked during the capture, evaluation, and execution of programs, that should give any of us in business, IA, software development, or any other affected field pause.

Take, for example, a malicious employee who fears termination. It’s possible (and some might argue likely) that they slip a backdoor into a computer program. While this certainly represents a criminal act, an entity that fails to apply its own policies or processes, or simply fails to implement effective controls, can be held liable, and at great cost. Similarly, malicious injection on the part of a third party would most likely be a criminal act; yet failure to identify the injected source, and to ensure that delivered application code is “free from all …”, remains the duty of the contracted party.

Given the increased attention on industry standards (e.g. ISO 27001, ANSI-748, etc.), cyber-security, and other domains of interest, issues of liability arise in areas that may initially seem unrelated. CMMI, for instance, wouldn’t appear to be a protection against debarment any more than effective Agile development supports EVM. Viewed holistically, however, each builds upon the other to develop a new kind of “defense in depth,” where the defense isn’t strictly within the realm of cyber, but extends into effective quantitative management of programs, and reasonable review of data to ensure a firm is practicing all necessary due diligence and control of program efforts.

These issues could be considered secondary to those that can be accommodated by best practices in development and quality assurance (QA), as specified through frameworks and maturity models such as CMMI. When considering contract language like the following, however, they necessarily must be included.

If the Government determines, after a security audit (e.g. ST&E), that software delivered under this task order is non-secure, the Government will provide written notice to the contractor of each non-conformity.

Software shall be "non-secure" under this task order if it contains a programming error listed on the current approved version of the CWE/SANS TOP 25 (which can be located at http://www.sans.org/top25-programming-errors) or a web application security flaw listed on the current approved version of the OWASP Top Ten (which can be located at http://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project).
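To make the quoted definition concrete, consider CWE-89 (SQL injection), a perennial entry on both lists. The fragment below is a minimal C# illustration, with hypothetical table and parameter names, of the flaw and its conventional remediation; it isn’t drawn from any delivered system.

// CWE-89: user input concatenated into a SQL statement ("non-secure"
// under the clause above), next to the parameterized fix.
using System.Data.SqlClient;

class LookupExample
{
    // Vulnerable: a userName like "x' OR '1'='1" changes the query's logic.
    static SqlCommand Vulnerable(SqlConnection conn, string userName)
    {
        return new SqlCommand(
            "SELECT * FROM Users WHERE Name = '" + userName + "'", conn);
    }

    // Remediated: the input travels as data, never as SQL text.
    static SqlCommand Parameterized(SqlConnection conn, string userName)
    {
        var cmd = new SqlCommand("SELECT * FROM Users WHERE Name = @name", conn);
        cmd.Parameters.AddWithValue("@name", userName);
        return cmd;
    }
}

The point for this discussion isn’t the fix itself, but that flaws of this class are exactly what static analysis, peer review, and traceable change control are positioned to catch.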

Given that traceable changes to source (SCM), peer review, and clear documentation are among the most effective methods to validate requirements, prevent malicious injection, and reconcile findings from static analysis, it’s worth noting that these are key concepts within most software QA and CPI frameworks. This traceability additionally supports progress-to-plan reporting and quantitative program management. At the end of the day, then, integrating and orchestrating existing best practices from the major domains of governance serves the goals of information security in this context.

To achieve convergence, information security professionals need a new way of thinking, along with supporting frameworks and tools that describe a greater role for governance applications. Akin to the growth of the business analyst’s role, which often includes the systems, requirements, and process engineering subdomains, IA professionals must remain aware of the broader scope of technological and procedural avenues to achieving compliance. We must additionally understand the industry, the applicable regulatory and legal issues, and the goals of the enterprise well enough to support efforts in process re-engineering in ways that can deliver value.

A New Market Advantage?

While this type of compliance, if enforced, could certainly be considered a program risk, I’d prefer to think of it as a strategic opportunity for those who lean forward in developing their capabilities in software and systems audit. While all companies necessarily innovate, what I suggest is more transformative in nature.

The National Cybersecurity and Critical Infrastructure Protection Act of 2013 contains guidance, and further opens transparent dialogue on public-private information sharing in other sectors. Consider the ramifications should information security and assurance extend beyond the requirements of existing regulation such as SOX, GLBA, etc., and beyond industry self-regulation in the form of PCI or BITS SAP. Given the criticality of IT to most organizations, the software underpinning most of these systems, and the emergent change toward requiring some degree of accountability from the developers of those systems, it’s nearly a foregone conclusion that additional guidance will be developed, that reporting and audit will be required, and that positive control/traceability of control effectiveness will be necessary to protect firms from liability.

If we, as leaders, consider these scenarios to be reasonably likely, it stands to reason that we should plan for disruptive changes in the development and delivery of solutions to our customers, and ensure appropriate control structures are developed and enforced internally. For those of us providing business services, consulting, software engineering and integration, or any number of other services, it additionally stands to reason that the adoption of orchestrated and auditable processes, enabled by technological integration, reduces the cost of compliance, using the overhead to create value. Additionally, the systematic application and quantitative management of these (as with any control framework) encourages continuous improvement and the development of capacity, opening the doors to a new set of offerings based on internal capabilities.

When disruption within an industry is seen as an opportunity for transformative change, rather than a threat to the bottom line, there is certainly a rationale to begin development of new offerings as early as possible, and to become involved in shaping the final outcome of required compliance.

This is particularly true in situations such as this one, where there is potential that a large, sustained, and relatively unfilled market will soon exist, created through legislation, much as it was for the financial and health auditors who validate compliance with SOX, GLBA, HIPAA, etc. It stands to reason that those firms adopting and developing these capabilities now will have a significant advantage.

Final Remarks

From a DoD contractor’s perspective, it’s important to note that these issues are particularly problematic in low-price awards, or in variations where evaluators don’t have the flexibility to consider them within the evaluation of staffing, technical approach, or price realism assessments.

Given that the majority of RFPs do not specifically include this type of language, yet the requirements still exist under FISMA and through the FAR clause(s) requiring compliance with DoD 8500.2 by reference (or referenced reference), it is difficult to justify the resources needed to comply with all controls within a cost narrative or delivered Basis of Estimate (BoE). Until mandatory compliance is set as an evaluated criterion in all contracts, rather than being included strictly within the statement of work, there remains limited opportunity to execute such a plan. Similarly, until we, as contractors, highlight this oversight (or until more severe breaches occur), a change to include this language is unlikely. It nonetheless remains a valuable and important area of change in organizational process, with opportunities likely in various markets.

I’ll certainly revisit this subject in the near future as I develop the actual security plans, compliance matrices, overlap charts, and guides for framework integration. In order for potential solutions to be applicable and available, my intent is to use open-source ALM tools to orchestrate processes for development, tied into ECM, BPM, and BI utilities for routing, reporting, knowledge management, and analysis. Look for a review of the available solutions in each of these spaces in the next parts of this series.

//Levii


 

Posted on November 30, 2013 in Business, Information Technology

 


Site Redevelopment

Though this site doesn’t have a substantial following, I do have some resources linked here and photos shared for friends, family and colleagues – so I did want to note that this site will be undergoing an overhaul in the next couple of weeks.  I’m finally moving off of a legacy Joomla! installation and onto WordPress.  There are a variety of reasons, but the most significant is to enhance media sharing and to more easily manage content and whitepapers that I’m putting together from academic, personal and professional research.

I’ve been working quite a bit with some of the open-source ERP systems, specifically ERPNext and OpenERP, which are both Python-based, so a move to a VPS is in order on the DreamHost side, which will also allow tighter integration with my various Windows Azure resources. Python itself is new to me, as most of my development has been done in C#, ColdFusion, JS, ASP, VBS, Perl, or Java … so this should be a good chance to work in what’s rapidly becoming the de facto language for scientific applications. I will be digging out source, applications, papers, and training materials to post as part of this effort, so things might be a bit clunky for a while as I develop a taxonomy of content … if I’m lucky I might even be able to get Tara to post!

While I’m at it, I’m working on cleaning up the site to organize my papers, publications, work, etc., as part of building out and demonstrating my personal brand. I’ll come back to this later, but for now I’m testing the integration and links between sites. Starting out with my  , I’ll be pulling all of my online presence together here to ease management and maintain ownership/control of the content.

Keep in touch.

//Levii

 

Posted on November 2, 2013 in General Information

 


Merging Word documents … What a pain


For the umpteenth time in my career, I had a document in MS Word (2010 in this case) that was reviewed by multiple individuals (about 8), with changes to formatting, content, etc., along with reviewer comments. These then had to be merged back into a single document, which is certainly easier than it used to be, but is still incredibly inefficient. Let’s look at the normal flow:

  1. Document Created (O1)
  2. Send to Review (email or ??)
  3. Receive n copies of the document back (R1 – Rn)
  4. Make a copy of the original doc (O2) and combine with the revised copy R1 [in the original document]
  5. Save O2 and close both files (or all of the merge panes).
  6. Select the combine option again, find O2 and merge in R2.
  7. Rinse and repeat steps 5 & 6 through Rn.

Certainly, using the version tracking of SharePoint 2010/2013 would be considerably easier while also allowing for concurrent editing, but that’s not always an option. Thinking of a large organization where this is done many hundreds of times each day, by thousands of employees, I was surprised that a small utility to perform the function of “merge these N files with O1” didn’t pop up in a quick Google/StackOverflow search. In my one instance, I missed a file … or messed something up the first time through, which killed about 20 minutes of my day that could otherwise have been productive in actually addressing the commentary. To that end, I decided to do something about it, which I’ll share in source once I’ve gone through it enough that it wouldn’t be embarrassing. It’s gotten a little bit cleaner and a whole lot faster, but since it’s based on InterOp it’s a temporary solution until the feature is fully implemented in Eric White’s EXCELLENT OpenXmlPowerTools library (link to the issue).

For the time being we’ll just call this WordMerge; it runs as a standalone executable. There’s not a heap of validation and error checking in place at the moment (e.g. a bad output path can be typed in, which will throw a general exception), but it has been through the paces and works pretty well, up to the 50 documents I tried. For my use, I’m perfectly happy with how it runs, but if anyone else grabs the concept before I get code out there, I’m open to suggestions (a SharePoint extension, an Office plugin, and a WPF version are already planned, BTW).
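Until the source is posted, here’s a rough sketch of the core loop such a tool can be built around, using the Word object model’s Document.Merge call. This is an illustration of the approach under my own assumptions (argument order, option choices, and names), not the actual WordMerge code.

// Sketch: fold each reviewed copy (R1..Rn) back into the original (O1).
// Hypothetical usage: MergeSketch.exe original.docx output.docx r1.docx r2.docx ...
using System;
using Word = Microsoft.Office.Interop.Word;

class MergeSketch
{
    static void Main(string[] args)
    {
        string original = args[0]; // O1
        string output = args[1];

        var app = new Word.Application { Visible = false };
        try
        {
            Word.Document master = app.Documents.Open(original);
            for (int i = 2; i < args.Length; i++)
            {
                master.Merge(args[i],
                    Word.WdMergeTarget.wdMergeTargetCurrent,
                    true,                                             // detect format changes
                    Word.WdUseFormattingFrom.wdFormattingFromCurrent,
                    false);                                           // don't add to recent files
            }
            master.SaveAs2(output);
            master.Close();
        }
        finally
        {
            app.Quit();
        }
    }
}

Since it’s InterOp, this requires Word installed on the machine, and it merges in command-line order, which is exactly why the re-order option mentioned below matters.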

You can see the interface in the screenshot below … not much to explain.

WordMerge screenshot

Certainly, it isn’t foolproof to merge everything in at once. If a person moved a lot of content, and other reviewers modified that content, MS Word would be unaware of the “right” order in which to resolve conflicts. In these cases there may be straggling words that weren’t part of the move (especially if the move was processed after the changes in the merge order), which I’ve tried to address by allowing a re-order of the merged documents list. That notwithstanding, the ability to toggle reviewers and types of changes in the review pane thankfully makes reconciling a much simpler process.

I’d be interested in how you normally handle disparate review documents … one at a time, copying over what you’d like to keep; merging them all by hand and accepting/rejecting changes; or something else entirely? I’ll pull thoughts into the merger, and possibly tie it to some other work I’m contemplating that uses NLP parsing systems in conjunction with context-free grammar generation to assist edits and rewrites from a knowledge management repository. That’s definitely further out on the horizon, though. Next up is a syntax highlighter for Word, simply because I’m tired of the wasted time and inconsistent formatting that inevitably result when you’ve got to embed source in software documentation. If there are suggestions for that before I get it released, feel free to send them my way.

Grab it, use it, save yourself some headache. If you’ve got feedback for me, let me know.

//Levii

 

Qiqqa Review

I’ve been pretty impressed with the current generation of reference management software, and I’m looking at Qiqqa now. Premium looks slick. I’ll come back for a fuller review, but if you want to check it out before I get to that point, use this link and you’ll get me a free month of premium, and two weeks free for yourself. Don’t worry, no credit cards or anything like that are required; nor does the free version cost a thing. It’s probably worth looking into if you’re just now starting to get into reference management solutions.

The Link: http://www.qiqqa.com/Register/106431

//Levii

 

Posted on July 17, 2013 in Information Technology

 


 