
Category Archives: Business

Any advice on approaching VC or incubators prior to patent?

Has anyone been through the patent process, as an investor or as the inventor/business entity, who has some advice or light to share?

I have more than a few patentable concepts for business-method patents, and from my own preliminary search I would be first-to-file on each.  While I've got the ability to implement them on my own from a technical standpoint, I find myself short on time (the PhD is a time-consuming monster), capital, and experience in the process of commercialization.

Since I realize that ideas are essentially worthless … I'm soliciting advice, partnership, mentorship, or other guidance from someone that's been through the process, in order to bring what I think is among the best of my backlog to market.

Background

While I've been expanding my marketing and branding experience through launching the LiquiCig brand and The Vaping Shop as a distributor and retail company to fund the brand's growth, I've also been helping Tara expand the business itself. I've been taking notes on gaps and difficulties in small-business platforms as I go, and in conjunction with my academic research I've been developing a framework and a set of methods, systems, and technologies to fill them (more on that in a different post).

Recently, I identified what I think is an opportunity: a method to significantly improve the ability to voluntarily collect valuable information in a highly targeted manner. The simplicity of execution and distribution has moved developing the method as a platform and licensable piece of intellectual property to the top of my list of supporting activities.

Applicability and Scope

The method would be applicable to marketing and data collection for any organization, and on any platform. It could add significant value to providers like SurveyMonkey, and be internally applicable to organizations developing their own visible dashboard of capabilities and experience.  It could be licensed as a method, or provided as a SaaS API integrating with client platforms, and would be valuable to the enterprise holding the data (on the SaaS side) in generating independent revenue streams.

The closest competitor I can think of is Google's AdWords, and I'm confident that with a little concept refinement, this method (and the technology platform to drive adoption) offers a larger scope and a more accurate means of targeting.

Conclusion

So my question is … what's next? I'm slightly capital constrained, and am not personally familiar with any VC firms or investor groups that support patent search and development as licensees (or that would leave me simply as a royalty holder at the end).  The amount of advice online is mind-boggling and frequently contradictory. Any advice, partnership, mentorship, or other guidance from someone that's been through the process would be truly appreciated.

//Levii

 

 

Posted on May 3, 2014 in Business

 

Invited to Present at the UIS Cyber Defense and Disaster Recovery Conference


It's been a while since I've posted anything of substance … buried in transitioning a new 24-person software development contract, academic papers for the PhD program, and bid & proposal efforts, it's likely to be a while yet.  Even so, I thought it worth posting that I've been invited to present at the 2014 Cyber Defense and Disaster Recovery Conference (CDDR) at the University of Illinois Springfield (UIS) on Thursday, March 13.  There should be a mid-sized (225+) paid audience of practitioners, businesspeople, and academics.  The event is sponsored by InfraGard Springfield with coordination and support by the FBI and UIS.

For more information on the conference, take a look at their archives – they've had a great list of presentations and speakers, and I'd certainly be privileged to be among them.

The theme I've been asked to present on is security incident response as firms migrate to distributed storage technologies, and I very nearly proposed a presentation title of The bits be everywhere – keeping them in their tubes & cleaning up the mess when they spring a leak.  I did come to my senses; the actual topic is TBA once I've confirmed my availability, but the general subject remains the same, and is something I've written around the edges of on this blog over the last year.  The working title (at least for the moment) is a bit more professional, and is something on the order of Minimally intrusive governance & distributed storage systems: Considerations for disaster recovery and contingency planning in a mobile world.

Knowing that there will be small-business leaders in attendance, and having been asked to make the presentation instructional, I'm tempted to fall back to the broader areas of governance, compliance & risk.  Since the ways varied attendees might prepare for security and incident response ultimately come down to "it depends", I think a broader perspective on the criticality of good governance and an orchestrated BCP/DRP process specific to distributed data storage should be appropriate.  If there's one thing I know, it's that awareness of security is insufficient, as is the presentation of a solid business case.  The competing priorities of security and workflow efficiency must be addressed, or people will work around the controls.  Though this is a recent area of study, Albrechtsen (2007) and Takemura (2011) both provide good evidence of it, identifying a need to blend not only awareness but also practical actions that can be embedded into process without significant impact on the overall efficiency of operations.

Viewed from the broader perspective, these are not easy challenges to solve; and unfortunately the problem is less frequently technological in nature, and more often tied to the behavior of the organization itself.  Addressing change in technology, workflow, and culture (regardless of reason) requires a deeply rooted desire to change behavior patterns in those who must implement them. In a fashion similar to the myriad theories and models of change management and organizational behavior, it's the individuals within the group who have to be effectively targeted.  Throw in a twist of technological adoption and the fear, uncertainty, and doubt (FUD) typically used in areas of security, areas often seen as "keeping people from doing their jobs", and you've got the perfect storm of things that are tough to change. Focusing, then, on the intersecting issues of storage management, IA, and the optimization of information security investments within a framework of process re-engineering and adoption strategies borrowed from the TAM and its successors (Venkatesh, Morris, Davis, & Davis, 2003), I'll be pulling from other research and courses I've developed to combine into a one-hour session.

At the very least, it did force me to take a look at my short bio … which I haven't done in far too long.  While not fantastic, I think it still gets the point across, and I've attached it here for my own entertainment. I'll post more detail as this gets fleshed out over the coming weeks, and as always I welcome any comments or input you have.

Short Bio – Levii Smith

//Levii

References

Albrechtsen, E. (2007). A qualitative study of users’ view on information security. Computers & Security, 26(4), 276–289. doi:10.1016/j.cose.2006.11.004

Takemura, T. (2011). Statistical Analysis on Relation between Workers’ Information Security Awareness and the Behaviors in Japan. Journal of Management Policy and Practice, 12(3), 27–37.

Venkatesh, V., Morris, M., Davis, G., & Davis, F. (2003). User Acceptance of Information Technology: Toward a Unified View. MIS Quarterly, 27(3), 425–478.

 

 

Safely Moving Sensitive Data Outside of the Organization: Using Outlook to Encrypt to External Entities

Sensitive information, whether:

  • FOUO documents being handled for the government
  • PII during hiring and staffing
  • Proprietary or confidential company information
  • Competition sensitive intelligence
  • or any other protection-required data (PRD)

requires special use and handling of digital communications.  Unfortunately, a lack of education and familiarity with available methods leads to either (1) data being exchanged through insecure means, or (2) a breakdown in communication.

Sharing this information without protection has significant implications under both criminal and civil law, as I've discussed before in relation to negligent entrustment.  Even so, it is quite often an area of Information Assurance (IA) delegated to the lowest levels to implement and follow from a policy perspective.  Given that a breach could be a public-relations nightmare, and that statutory liability could be financially destructive to a firm, it's essential that data is protected at rest and in transit.

Truthfully, among the methods available for dealing with risk (i.e. insurance, transference, mitigation, avoidance), the simplest in this regard is avoidance: "Just don't send PRD."  However, a knee-jerk refusal to share sensitive data can be problematic to the operation of many organizational programs, and an intentional breakdown of the ability to operate transparently is not only inefficient, but can signal larger communication issues.

Too frequently, I've encountered the assumption that encrypted mail cannot be exchanged between unrelated organizations, though the process is far simpler than it seems. I think this is due in part to the lack of education we provide as part of corporate/government/etc. cyber-awareness programs, but certainly also to the fact that the first step in the process (the public key exchange) is typically automated and transparent.  Since the advent of Kerberos, however, virtually every directory service with modern authentication uses some form of certificate authority (CA), and having neither PKI certificates available within an organization nor certificates signed by an external CA (eCA) borders on negligence.

This paper provides a short guide to address this gap, and assumes one of the most common scenarios within the enterprise: the use of a public key infrastructure and Microsoft Outlook. Though certainly relevant to contractors and the DoD, it remains applicable anywhere these two criteria are met.
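Outlook handles the mechanics through its GUI once certificates are exchanged (send a signed message so the recipient captures your certificate, have them do the same, then enable encryption for the thread). For anyone curious what happens under the hood, here is a minimal sketch of building the same kind of S/MIME envelope in Python with the M2Crypto library; the file name is hypothetical, and this illustrates the concept rather than Outlook's internals:

```python
# Minimal S/MIME encryption sketch using M2Crypto (pip install M2Crypto).
# "recipient_cert.pem" is a hypothetical export of the recipient's public
# certificate -- the same artifact Outlook captures from a signed message.
from M2Crypto import BIO, SMIME, X509

def encrypt_for(recipient_cert_path: str, message: bytes) -> bytes:
    smime = SMIME.SMIME()

    # Encrypt to the recipient's public key; only their private key opens it.
    stack = X509.X509_Stack()
    stack.push(X509.load_cert(recipient_cert_path))
    smime.set_x509_stack(stack)
    smime.set_cipher(SMIME.Cipher('aes_256_cbc'))

    # Build the PKCS#7 envelope and serialize it as an S/MIME message body.
    p7 = smime.encrypt(BIO.MemoryBuffer(message))
    out = BIO.MemoryBuffer()
    smime.write(out, p7)
    return out.read()

if __name__ == "__main__":
    body = encrypt_for("recipient_cert.pem", b"FOUO: draft staffing plan attached.")
    print(body.decode())  # ready to hand to any mail transport
```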

 

Strategic Generalization – Staffing and Training Efficiently

Don’t discount generalization. Build a strategy of integration from multiple supporting specialties.

I began writing an introduction and synopsis of my academic and professional career to date, initially intended to be a single document and page on Levii.com, in an effort to fill some information gaps and complete a profile of my personal brand. Doing so carried a secondary goal: sharing my perception of what I've observed to be a typical path from young "geeky IT weenie", through the disciplines of technology, and into applying that background across the various domains of business.

Of course there are differing approaches, skills, and backgrounds needed for the strategic planning, business development, management, negotiation, etc. required to grow beyond "simply IT", and within IT there are often very good reasons not to generalize beyond a specific discipline.  IT is, however, foundational to the modern enterprise, and for IT professionals planning options for growth, a pragmatic perspective on the art of the possible, and on the appropriate places for technology to support the business plan, are essential skills to learn.  These skills necessarily require a degree of generalization outside of technology, and provide an opportunistic basis to strategically develop multiple specializations over time.

I would by no means marginalize the achievements and value that specialists afford in every industry, and in virtually every organization that's grown beyond a small(ish) size. While I unfortunately can't provide empirical evidence or staffing-profile models to generate an optimum balance of specialists to generalists, I can generalize from observation, provide some stories and cases of interest, and infer anecdotal evidence from successes and failures covering 15+ years.  My background, and therefore this discussion, focuses on the value of strategic generalization.

Pigeonholing Specialists, and Discounting Generalists

For whatever reason, the role of the generalist has been reduced to a false dichotomy between specialization in a targeted domain and the application of critical thought and skills across a broad spectrum of problems. Even a Google search for related terms returns virtually no discussion of the concept, and the perception that these must be either-or is a logical fallacy.

Before I'm accused of arguing against fallacy with a different fallacious argument (presumably this one), I encourage anyone interested to consider the great minds of the ages.  The concept of the learned gentleman certainly fell out of popularity with the aristocracy, and it's understood that many fields require a lifetime to master.  Admittedly, the notion that a single person could know all that was known in their chosen field, while also having more than a hobbyist's knowledge of the arts, philosophy, math, sciences, etc., is nonsensical in the modern age of knowledge creation.

Even so, the polymaths of the Renaissance aren't unheard of, and a similar capacity for ingenious connections and creation isn't impossible to achieve today.  I'd argue instead that it should be encouraged, and for anyone capable and willing to put in the necessary time and effort, it can be a rewarding experience.

While reviewing lists of lessons learned and brainstorms on current research, and trying to develop roadmaps that help technologists identify the most effective opportunities, I found it necessary to better define the scope and nature of organizational structure and behavior for those I mentor.  Since I challenge each of them to develop a broader set of skills and to become the next generation of leaders, I had to address the near-constant reminders of how "computer people" are perceived.  How, then, to address the empirical argument that it's far more common to move from business into IT management than it is to successfully move from IT into business management?

As many times as I've seen this trend, and through participation in numerous discussions and panels that fielded questions about roles, it remains clear to me that it's far more effective to have a smaller but highly talented workforce.  Projects and programs with multifaceted personnel possessing a broad spectrum of subject intelligence and skills tend to complement each other, develop specialty where none existed before, and reach out when necessary to request the support of true experts within a specialty.

It's an odd finding, then, that while breadth is useful and valuable when overall firm success is the measure, the increased horizontal perspective and specialist-level depth in select areas aren't necessarily seen as valuable, and often lead to perceptions of over-qualification.  This tendency to pigeonhole expertise leads to a larger and more narrowly focused staff, decreased innovation in process and product generation, and an organization whose footprint costs more than is ordinarily necessary.

The rationale isn't too hard to find, often coming down to the overlap and relevancy of the generalist's expertise. With the pattern-matching systems used for HR databases, a generalist is easily overlooked; and given the limited space of a non-descriptive, task-based requisition, it's difficult for an organization to see where a role might emerge that combines multiple positions into a functional subgroup that numbers fewer but brings more to the table.  Of course there's a human factor involved on both sides: in forming a recruitment and staffing process that takes abstract generalities and redefinition of need into account, and in a candidate's ability to "sell" themselves into a role, rather than a job.

Focus on the need, not the specific skill

Consider technology, where roles have emerged that may or may not be associated with a specific job title or skill. It is often the role of the architects or the abstractionists to tie competing concepts together into a viable solution to the problem at hand.  In more cases than not, this requires expertise and experience spanning multiple domains, developed as a grouping of specializations over time.  I'd argue that this is a specialization in its own right, and often one that's expensive to acquire (for both the entity needing the skill and the person providing it).  While the value of labor at the lowest level of implementation tends to be lost, those are often commoditized positions requiring "butts in seats", best classified as jobs and not roles, where what is provided is labor, and not the creation of value.

Of course, not all organizations, thinkers, or leaders fall into this either-or trap; but it remains the language by which it is typically discussed.  There are a number of arguments that can be made against specialization, and supporting models for when it makes sense.  For my money, I’ll take a team of T-Shaped people that have access to expert specialists, as necessary, any day.

To sum this whole misconception up, popular sayings about tools can be readily applied to skills. In both cases they are overly simplistic and incomplete.

  • Use the right tool for the job.
  • The more tools in your toolbox, the more options you’ve got.

What should be added to the discussion, and find its way back into the mindset of leaders, professionals, and practitioners, is the value afforded when we bridge the gap between commodities and specialties.  We need to realize that it's not an either-or situation, and look introspectively across our organizations to determine the roles and capabilities that fill the needs of overall objectives, rather than the jobs that fill available time by creating work over value.

On specialties

  • If you're missing a tool that will be used frequently, consider getting it.
  • If a tool is difficult to borrow when you need it, consider getting it yourself.
  • If a tool is specialized, expensive, or won’t be used frequently enough to be worthwhile getting yourself; work with the person that already has it.

On commodities

  • If the tool is inexpensive, infrequently used, but common enough that it’s easy to borrow, work with the person that already has it.

As with anything else, there's a cost/benefit associated with each scenario.  When you need something frequently, absorbing the cost isn't difficult to rationalize as a necessity; and occasionally, having a tool is important for no reason other than the difficulty or cost of finding it within the time constraints in which it's needed.

Whether specialization in staffing is effective or a truly generic workforce will suffice, how many varieties and combinations of all-in-one tools are sensible, whether outsourcing of labor (commoditized, specialized, or otherwise) is the more effective scenario, and to what degree the mix is optimal all remain unique to each case.

To evaluate some of these, the rest of this series will focus on instances from my own background, combined with a smattering of research and retrospective analysis, where abstract and specialized reasoning, combined with both breadth and depth of knowledge, really is its own unique tool in the toolbox.

As always, I look forward to a continued discussion and any insight or commentary you have in any of the forums you find me.

//Levii


 

Posted on January 25, 2014 in Business, General Information

 

Extending Requirements Management for Business Planning

I've used SysML to model requirements and traceability for impact analysis, coverage, etc. for a fair amount of time, and across more domains than the typical use of CASE tools.  I've found it useful for modeling governance and compliance requirements drawn from regulatory, contract, and corporate policy, and have even used requirements models to capture my own priorities for easy loading into traditional project management software (e.g. MS Project).  To date my applications of choice have always been based on the Eclipse Requirements Modeling Framework (RMF) as incorporated in the TopCased Model Based Systems Engineering (MBSE) tool-chain.  That project is now being migrated to PolarSys, though the requirements part appears to have been incorporated as a standalone component by ProR.
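To make the kind of query this enables concrete, here is a minimal sketch (a hypothetical toy data model, not RMF's or ProR's API) of requirements as a traced graph, supporting the impact-analysis and coverage checks mentioned above:

```python
# Hedged sketch: requirements with "derives from" links, enabling the two
# queries I lean on these tools for -- impact analysis and coverage gaps.
from collections import defaultdict

class ReqModel:
    def __init__(self):
        self.reqs = {}                         # id -> description
        self.derived_from = defaultdict(set)   # child -> parent requirements
        self.satisfied_by = defaultdict(set)   # req -> implementing artifacts

    def add(self, rid, text, parents=()):
        self.reqs[rid] = text
        for parent in parents:
            self.derived_from[rid].add(parent)

    def impact(self, rid):
        """Everything downstream of a changed requirement."""
        hit, stack = set(), [rid]
        while stack:
            cur = stack.pop()
            for child, parents in self.derived_from.items():
                if cur in parents and child not in hit:
                    hit.add(child)
                    stack.append(child)
        return hit

    def uncovered(self):
        """Leaf requirements with no satisfying artifact -- a coverage gap."""
        parents = {p for ps in self.derived_from.values() for p in ps}
        return [r for r in self.reqs
                if r not in parents and not self.satisfied_by[r]]

m = ReqModel()
m.add("REG-1", "Retain records for 7 years")
m.add("POL-4", "Backup retention policy", parents=["REG-1"])
m.add("PROC-9", "Monthly archive job", parents=["POL-4"])
print(m.impact("REG-1"))   # {'POL-4', 'PROC-9'}
print(m.uncovered())       # ['PROC-9'] until an artifact satisfies it
```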

In traditional Microsoft fashion, a good set of tools based on the Collaborative Systems Requirements Modelling Language (CSRML), a goal-oriented language for requirements modelling, has recently come to my attention.

If there's anything you need a formal requirements model for, but without the overhead of a full MBSE approach, it's certainly worth looking into.  Questions regarding the point of modeling non-engineering activities are relatively predictable, and I'd like to address a couple of the more common ones, asked by those represented in the Venn diagram below.

Actors in Enterprise Compliance

 

“What’s wrong with traditional BPM, CASE, RM, or other IDE type of tools?”

Unfortunately, Papyrus, TopCased, and many similar applications are (a) less than user-friendly for those without a background in UML, and (b) limited in scope of representation and access for collaboration.  From that perspective I wanted to think through alternative approaches, and the explosion of collaborative IDEs for software development leaves me hoping for a similar capability for modeling.

There is certainly no shortage of collaborative applications, and I'll follow up on some other shortcomings in the current SaaS market in a separate post. A prime example of a promising entry, however, is the class of IDEs that provide tangible value to teams, such as Cloud9 (use Chrome or FF).  This SaaS IDE provides real-time visibility between members working on a project, and is an incredibly efficient tool I use when working with remote developers during interviews and skills assessments.  While I firmly believe nothing quite replaces a face-to-face interview, working through a real-world project gives me insight into a candidate's thought processes and fit with the team dynamic they're interviewing for … saving on travel costs, or the expense of a poor hiring decision.  Alternately, something as simple as providing training, guidance, and leadership support to a remote employee who needs help with a concept is facilitated through such tools. Of course we've got many options for this, from simple screen-sharing and chat applications to full office suites such as Office365, Google Docs, or Office 2010 (when backed by SharePoint), which all support real-time, multi-user, collaborative editing.  Extending the IDE as an asset into this domain solves two intermediate problems in having our requirements and models generate value for the organization.

  1. Enterprise architecture (EA) is often seen by personnel as an esoteric representation of the organization and its processes.  Without open access and visibility, ease of manipulation, and methods of sharing EA data, it often ends up gathering dust in the corner instead of being used to drive change. By enabling a collaborative, data-driven view of the enterprise (whether business process or IT architectures), the uses of the data are opened to creative possibilities, and transparency can assist in the maintenance of the information – preserving the original investment in investigation.
  2. Addressing requirements and developing plans for compliance within the enterprise is noble, but only touches on the underlying need for traceability. With a more collaborative and diverse audience, the ability to trace requirements, their precedence, and their sources, while mapping them against each other, would bring a degree of transparency to the areas of Governance, Compliance & Risk (the subject of my next article).

“Why do I need to model my compliance needs? What’s wrong with using ABC?”

Some might accuse me of overcomplicating what is, on the surface, a relatively straightforward line of questioning: "What are our compliance requirements, what policies and processes do we have to cover them, and how do we ensure our due diligence in validating adherence to those policies?"  These might include topics in subcontractor management, ANSI EVM controls, IA control requirements for PCI, or any of a myriad of complementary frameworks. In the typical firm, however, the perspective necessary to answer this line of questioning is rarely found in one person or group, and without collaborative ideation it's incredibly likely that either (a) something will be missed or (b) the information isn't effectively communicated to personnel.  In either instance, a failure of due diligence may leave the firm at risk.

Often I'll see this documented as spreadsheets with a matrix of tabs, but that fails the traceability test and doesn't lend itself to automation in case of audit.  Furthermore, it fails to offer extensibility at scale, doesn't afford the organization an efficient method of communicating the requirements to program/project managers or their staff, dramatically increases the overhead required to train and validate program execution, and increases the rework or rediscovery needed by personnel who need that data.
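As a trivial illustration of the difference (the control and policy identifiers below are hypothetical), even a flat list of mappings held as data rather than spreadsheet tabs can be queried in both directions and audited for gaps automatically:

```python
# Hedged sketch: the same control-to-policy matrix as queryable data.
mappings = [
    ("PCI-DSS 3.4", "POL-ENC-01"),   # hypothetical control/policy IDs
    ("PCI-DSS 8.2", "POL-IAM-03"),
    ("ANSI-748 G2", "POL-EVM-02"),
]
controls_needed = {"PCI-DSS 3.4", "PCI-DSS 8.2", "PCI-DSS 10.1", "ANSI-748 G2"}

# Gap audit: which required controls have no covering policy at all?
covered = {control for control, _ in mappings}
print("Unmapped controls:", controls_needed - covered)   # flags PCI-DSS 10.1

# Reverse traceability: which policies address a given control?
print("Policies for PCI-DSS 3.4:",
      sorted(policy for control, policy in mappings if control == "PCI-DSS 3.4"))
```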

Blame it on my firm belief in quantitative measurement and management as not only a "best practice" but a "required practice" for high-performing teams and organizations; or perhaps on my firm belief in transparent leadership and governance.  Enough people have heard my argument against spreadsheets as a form of enterprise knowledge management to understand my perspective on why this is a bad idea, and why the way the typical firm handles this today needs something different. That something needs to be easy to use for the small to mid-sized firm, needs to tie into ERP and audit systems, and needs to be a collaborative method treating EA & Governance as reusable, transparent, and executable enterprise assets.

Though I say it jokingly to my colleagues, there is truth when I note that I'll need to consider an MBA after I complete my PhD, and in all likelihood should follow that with a JD … there's a niche opportunity for systems, and for people, with a deep understanding across the areas of business, law & technology.

“So what can we do about it?”

While I'm often accused of having big ideas … being capable of developing solutions to most of them is a core strength I rely on to demonstrate value through prototypes.  To that end, and based on my research and papers on the extent to which regulatory, contract, and corporate governance cross organizational disciplines, I've begun modeling a SaaS solution for publication control mapping and compliance methods.  I'm through the generic model of compliance, and have a number of concepts on how to carry it forward, but am leaning towards a community-driven site and application that could tie to my parallel efforts in using predictive analytics for effort & cost estimation.  As each gets a bit more polished I'll publish them under an open license; in the meantime I certainly appreciate any insight, critique, or commentary on the subject.

So jump on in and share your thoughts.  Do you see this as a problem, or is it a niche need? How do you manage the myriad set of policies, procedures, and alignment to the governance frameworks in your industry? Any particular tools that you use, and how do you ensure that those processes are executable by staff? Do you have an orchestrated set of executable workflows, or is it tribal knowledge and lots of training (for hopefully compliant programs)?

 

Posted on January 5, 2014 in Business, Information Technology

 


DoD Contracts and EVM Requirements: Worth a Second Look for Program Managers

I've written somewhat frequently on the benefits of quantitative management of IT programs, and specifically on some of the problems faced by defense contractors in aligning industry best practice to ANSI-748 EVM control requirements.  I'm certainly not the only voice that's advocated for reform on the acquisitions side of the house, and there are more than a few good perspectives on alignment with Agile practices, Earned Value for Business Value, CMMI support for EVM, and the various tools and processes to make it all work together.  I'd therefore be remiss if I didn't open this post by giving credit where it's due. My perspective in this domain has been heavily influenced by Dale G., Glen B. Alleman, Paul J. Solomon, the contributors on the StackExchange network, and a significant number of papers in both the academic and practitioner literature.  For anyone interested, I'll post my reference library as a BibTeX file at a later date, or feel free to contact me on any site I frequent and I'll package up the pertinent parts to send you.

The issue of compliance has been particularly troublesome for projects that are effectively fixed in terms of cost (i.e. the contract award value) and resources (i.e. the bid LoE supporting the effort), while the scope of any particular work package remains flexible (e.g. sustainment, maintenance, or Agile development). Though I've argued that the DCMA guidance and intent statement, in conjunction with the ANSI specification itself, both allow for the use of Technical Performance Measures (TPMs) as a method of tracking progress-to-plan, the assignment of value to technical tasks, construction of the Integrated Master Schedule (IMS), and development of the Work Breakdown Structure (WBS) are often constrained by the Statement of Work (SoW)/Performance Work Statement (PWS), or by the acquisition (AQ) process.

Thankfully, a footnote and minor change in the November 26, 2013 reissuance of DoD 5000.02 streamlines a lot of this effort, particularly for programs under the $50M threshold. Though this particular language gives 180 days to redevelop the guidance, the interim instruction appears specifically intended to better facilitate the acquisitions process, and could well be read to open up the TPM method of metric utilization by aligning more closely to the Systems Engineering sub-processes.

DoD 5000.2 EVM Requirement - November 26, 2013

This change reinvigorates my original argument: while the DCMA has allowed this type of cost allocation to technical measures for some time, acquisitions (AQ) have been constrained in its application.  Realizing that many smaller programs bore an undue level of overhead generating financial metrics that didn't necessarily trace back to the value being derived, and that numerous projects using EVM strictly to comply with AQ policy have failed to provide effective measures for program managers, it seems we're finally getting there.  The option for AQ to define EV on technical measures can be used to tightly align industry best practices (e.g. Agile, Kanban, CMMI, ITIL, or ISO processes) back to the EVM standard, and enables us to price contracts more competitively.  Without the additional overhead and financial wizardry needed to make a non-EVM program report as if it were one, and with the ability to "manage by the numbers", it opens an avenue for much more productive programs, and a reduction in taxpayer-funded waste.
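For illustration, here is a toy sketch of what earning value on a technical performance measure looks like next to the traditional indices (the numbers and the simple percent-complete TPM are hypothetical, not a DCMA-prescribed formula set):

```python
# Toy illustration: earning value on a work package from a technical
# performance measure (TPM) instead of purely financial percent-complete.
from dataclasses import dataclass

@dataclass
class WorkPackage:
    budget_at_completion: float  # BAC allocated to this work package
    planned_pct: float           # planned % complete at the status date
    technical_pct: float         # measured technical completion (the TPM)
    actual_cost: float           # ACWP to date

    @property
    def planned_value(self):     # PV / BCWS
        return self.budget_at_completion * self.planned_pct

    @property
    def earned_value(self):      # EV / BCWP, earned on the TPM
        return self.budget_at_completion * self.technical_pct

    def indices(self):
        ev, pv, ac = self.earned_value, self.planned_value, self.actual_cost
        return {"SPI": ev / pv, "CPI": ev / ac, "SV": ev - pv, "CV": ev - ac}

wp = WorkPackage(budget_at_completion=120_000, planned_pct=0.50,
                 technical_pct=0.45, actual_cost=58_000)
print(wp.indices())  # SPI 0.90: behind plan; CPI ~0.93: slightly over cost
```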

The biggest surprise, however, lies in what's missing.  A close look at the EVM requirements themselves (in the excerpt above) shows that task orders above $20M but below $50M must use a system that's compliant with the EVM guidelines (and therefore open to the optional use of TPMs), but not a formally validated EVMS.
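Stated as a decision rule, that reads roughly as follows (a hedged sketch: the boundary handling and the below-$20M default are my reading of the excerpt, not official acquisition guidance):

```python
# Hedged sketch of the threshold logic as I read the interim DoDI 5000.02
# excerpt; dollar values in millions, boundary handling assumed.
def evm_requirement(award_value_millions: float) -> str:
    if award_value_millions >= 50:
        return "EVM-guideline-compliant system, formally validated EVMS"
    if award_value_millions >= 20:
        return "EVM-guideline-compliant system (formal validation not required)"
    return "No default EVM mandate (may still be imposed contractually)"

for value in (12, 35, 80):
    print(f"${value}M award: {evm_requirement(value)}")
```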

For the near future, it would certainly appear that smaller projects, and particularly those supporting more fluid requirements realistically best described as Level of Effort and/or Time and Materials (T&M) types of contracts, have had the noose loosened a notch.  I'd certainly advise any PM in the industry under this requirement to take a closer look … you might find you've got the opportunity to improve not only your reporting, but the processes themselves, gaining efficiency and analytic intelligence for continuous improvement.

As with everything else on this site, I hope to hear your thoughts. Contact me, or leave your comments below & I’ll do my best to get back to you.

//Levii

 

Posted on December 14, 2013 in Business

 


Easily Remember Passwords with Keyphrases

After a week away from the office over Thanksgiving, a colleague had understandably forgotten their convoluted non-dictionary, non-repeating, non-sequential, non-patterned, mixed-case, alphanumeric-and-special, 15-character password. Give it a week or two and who wouldn't?

As they were trying to think of a memorable password strong enough to meet policy, I thought of an article I'd published circa 2006, and thought it was worth putting back out. It's hard to believe it was so long ago, but that aside, it's not a new subject: it was the focus of one of my favorite XKCD comics, came up on Slashdot in a discussion of alternate password systems, and appeared over at Schneier on Security as a perspective on the secret question.

Keyphrase Chart

Though I initially visited the subject before the ubiquity of smart devices, when I was more involved in the day-to-day system and network administration of organizations, it still seemed relevant as I considered it today. Since many people aren't using systems like KeePass, can't carry personally owned internet devices into their workplace, or simply aren't comfortable with truly random passwords, this article presents the same approach, redone in a different code base simply to demonstrate the concept. Truthfully, it's fairly simple, reflecting a mix of two of the oldest methods of concealing messages: the substitution and book ciphers. The idea was, and remains, to give users a straightforward way to turn short, easily memorable phrases into keys for more complex passwords, and to make cycling passwords as simple as grabbing a new wallet-sized keyphrase card. In the original incarnation, I printed two (2) of each card so people could keep a copy in a safe location and carry the other with them. This allows strong passwords on the system, while keeping them easy to remember.

The approach is a relatively straightforward substitution cipher, strengthened by a randomized card that's unique to each user. Carry the card (and effectively your passwords) with you, stick it under a keyboard, tape it to your monitor, etc. All you need to remember is the simple passphrase. The same key can be used continuously: with two card characters per letter, the word "WORK" becomes an eight-character alphanumeric and special-character password, maintaining a reasonable degree of security. Since the actual key is easily remembered, stick cards for different purposes (i.e. work vs. home) in different locations, or keep a backup copy of the card in an accessible location … without significant risk of password compromise. Furthermore, regular password changes become more likely, since the keyphrase itself doesn't need to change and there's less concern about forgetting a batch of new passwords.

There are certainly changes I'd make to this in a production environment; I'd imagine that plastic cards, a unique chart printed on the back of a business card, or a digital version would be an improvement over the stack of printed cards I'd used before. I'm sharing this as I rework some of my whitepapers and other concepts into what is hopefully useful content, and to contextualize them to spark discussion. Given the state of existing systems, it's an idea worth further evaluation as a potential enhancement, or alternative, to the secret question, the site seals that are growing in popularity, or the keypad entry points that aren't used nearly frequently enough. Without threat modeling and algorithmic analysis, the biggest concerns I'd focus on are shoulder-surfing as a means to learn the key, and the fact that all passwords are the same length, which might simplify cryptanalysis. Both, however, are relatively straightforward changes to the design.

The output is here for anyone interested in the method, and for the sake of completeness the source used to generate the output is posted below. As always I welcome any discussion and feedback on applications, how you manage the overhead of password complexity and human memory, or anything else you might feel like throwing my way.
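What follows is a minimal Python sketch in the spirit of that source (a reconstruction of the concept rather than the original listing; the character pool and the two-characters-per-letter mapping are illustrative assumptions, not the original design choices):

```python
# Keyphrase card sketch: a randomized substitution card turns a short,
# memorable phrase into a longer mixed-character password.
import random
import string

# Character pool for the card: mixed-case alphanumerics plus specials.
POOL = string.ascii_letters + string.digits + "!@#$%^&*-_=+?"

def generate_card(chars_per_letter=2, seed=None):
    """Build a card mapping A-Z to short random strings; pass a seed to
    reproduce the same card for the second (backup) printed copy."""
    rng = random.SystemRandom() if seed is None else random.Random(seed)
    return {letter: "".join(rng.choice(POOL) for _ in range(chars_per_letter))
            for letter in string.ascii_uppercase}

def encode(phrase, card):
    """Turn the memorable keyphrase into the actual password via the card."""
    return "".join(card[c] for c in phrase.upper() if c in card)

if __name__ == "__main__":
    card = generate_card()
    for letter in string.ascii_uppercase:   # print the wallet card
        print(f"{letter}: {card[letter]}", end="  ")
    print()
    print("WORK ->", encode("WORK", card))  # four letters -> eight characters
```

Cycling passwords then amounts to printing a fresh card; the memorized phrase never changes.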

 
 