
Author Archives: Levii Smith

Strategic Generalization – Staffing and Training Efficiently

Don’t discount generalization. Build a strategy of integration from multiple supporting specialties.

I began writing an introduction and synopsis of my academic and professional career to date, initially intended as a single document and page on Levii.com to fill some gaps and round out a profile of my personal brand. Doing so carried a secondary goal: sharing my perception of what I’ve observed to be a typical path from a young, “geeky IT weenie” through the disciplines of technology and into applying that background to the various domains of business.

Of course, growing beyond “simply IT” requires differing approaches, skills, and background for strategic planning, business development, management, negotiation, and so on, and within IT there are often very good reasons not to generalize beyond a specific discipline.  IT is, however, foundational to the modern enterprise, and for IT professionals planning their growth, a pragmatic sense of the art of the possible, and of the appropriate places for technology to support the business plan, are essential skills to learn.  These skills necessarily require a degree of generalization outside of technology, and they provide an opportunistic basis for strategically developing multiple specializations over time.

I would by no means marginalize the achievements and value that specialists provide in every industry, and in virtually every organization that has grown beyond a small(ish) size. While I unfortunately can’t provide empirical evidence or staffing models that would generate an optimum balance of specialists to generalists, I can generalize from observation, share some stories and cases of interest, and draw anecdotal evidence from successes and failures spanning 15+ years.  My background, and therefore this discussion, focuses on the value of strategic generalization.

Pigeonholing Specialists, and Discounting Generalists

For whatever reason, discussion of the generalist’s role has settled into a false dichotomy between specialization in a targeted domain and the application of critical thought and skills across a broad spectrum of problems. Even a Google search for related terms returns virtually no discussion of the concept, and the perception that these must be either-or is a non sequitur.

Before I’m accused of arguing against fallacy with a different fallacious argument (presumably this one), I encourage anyone interested to consider the great minds of the ages.  The concept of the learned gentleman certainly fell out of popularity with the aristocracy, and it’s understood that many fields require a lifetime to master.  Admittedly, the notion that a single person could know all that was known in their chosen field, while also having more than a hobbyist’s knowledge of the arts, philosophy, math, sciences, etc., is nonsensical in the modern age of knowledge creation.

Even so, the polymaths of the Renaissance aren’t unheard of, and a similar capacity for ingenious connections and creation isn’t impossible to achieve today.  I’d argue it is something that should be encouraged, and for anyone capable and willing to put in the necessary time and effort, it can be a rewarding experience.

While reviewing lists of lessons learned, brainstorming on current research, and trying to develop roadmaps that help technologists identify where the most effective opportunities might be, I found it necessary to better define the scope and nature of organizational structure and behavior for those I mentor.  Since I challenge each of them to develop a broader set of skills and to become the next generation of leaders, I had to address the near-constant reminders of how “computer people” are perceived.  How, then, to address the empirical argument that it’s far more common to move from business into IT management than it is to successfully move from IT into business management?

As many times as I’ve seen this trend, and through participation in numerous discussions and panels fielding questions about roles, it remains clear to me that a smaller but highly talented workforce is far more effective.  On projects and programs staffed with multifaceted personnel who bring a broad spectrum of subject knowledge and skills, people tend to complement each other, develop specialties where none existed before, and reach out when necessary for the support of true experts within a specialty.

It’s an odd finding, then, that while broad horizontal perspective combined with specialist-level depth in select areas is useful when overall firm success is the measure, it isn’t necessarily seen as valuable in individuals, and often leads to perceptions of over-qualification.  This tendency to pigeonhole expertise leads to larger organizations with narrower focus, decreased innovation in process and product, and a footprint that costs more than is ordinarily necessary.

The rationale isn’t too hard to find, often coming down to the overlapping areas and relevancy of the generalist’s expertise. With the pattern-matching systems used for HR databases, a generalist is easily overlooked, and given the limited space of a non-descriptive, task-based requisition, it’s difficult for an organization to see where a role might emerge that combines multiple positions into a functional subgroup that numbers fewer people but brings more to the table.  Of course there’s a human factor on both sides: in forming a recruitment and staffing process that accounts for abstract generalities and the redefinition of need, and in a candidate’s ability to “sell” themselves into a role, rather than a job.

Focus on the need, not the specific skill

Consider technology, where roles have emerged that may or may not be associated with a specific job title or skill. It is often the architects or abstractionists who tie competing concepts together into a viable solution to the problem at hand.  More often than not, this requires expertise and experience spanning multiple domains, developed as a grouping of specializations over time.  I’d argue this is a specialization in its own right, and often an expensive one to acquire (both for the entity needing the skill and for the person providing it).  Meanwhile, labor at the lowest level of implementation tends to be commoditized: “butts in seats” best classified as jobs rather than roles, where what is provided is labor, not the creation of value.

Of course, not all organizations, thinkers, or leaders fall into this either-or trap, but it remains the language in which the topic is typically discussed.  There are a number of arguments to be made against specialization, as well as supporting models for when it makes sense.  For my money, I’ll take a team of T-shaped people with access to expert specialists, as necessary, any day.

To sum this whole misconception up, popular sayings about tools can be readily applied to skills. In both cases they are overly simplistic and incomplete.

  • Use the right tool for the job.
  • The more tools in your toolbox, the more options you’ve got.

What should be added to the discussion, and find its way back into the mindset of leaders, professionals, and practitioners, is the value afforded when we bridge the gap between commodities and specialties.  We need to recognize that it’s not an either-or situation, and look introspectively across our organizations to define the roles and capabilities that meet overall objectives, rather than the jobs that fill available time by creating work instead of value.

On specialties

  • If you’re missing a tool that will be used frequently, consider getting it.
  • If a tool you lack would be difficult to borrow when you need it, consider getting it yourself.
  • If a tool is specialized, expensive, or won’t be used frequently enough to be worth getting yourself, work with the person that already has it.

On commodities

  • If the tool is inexpensive, infrequently used, but common enough that it’s easy to borrow, work with the person that already has it.

As with anything else, there’s a cost/benefit trade-off associated with each scenario.   When you need something frequently, absorbing the cost isn’t difficult to rationalize as a necessity; and occasionally, owning a tool is important for no reason other than the difficulty or cost of finding it within the time constraints in which it’s needed.

Whether specialization in staffing is effective or a truly generic workforce will suffice, how many varieties and combinations of all-in-one tools are sensible, whether outsourcing labor (commoditized, specialized, or otherwise) is the more effective scenario, and to what degree the mix is optimal all remain unique to each case.

To evaluate some of these, the rest of this series will focus on instances from my own background, combined with a smattering of research and retrospective analysis, where abstract and specialized reasoning, together with both breadth and depth of knowledge, really is its own unique tool in the toolbox.

As always, I look forward to a continued discussion and any insight or commentary you have in any of the forums where you find me.

//Levii


Posted on January 25, 2014 in Business, General Information

 

Extending Requirements Management for Business Planning

I’ve used SysML to model requirements and traceability for impact analysis, coverage, etc. for a fair amount of time, and across more domains than the typical use of CASE tools.  I’ve found these models useful for capturing governance and compliance requirements from regulatory, contract, and corporate policy sources, and I’ve even used requirements models to capture my own priorities as an easy load into traditional project management software (e.g., MS Project).  To date, my applications of choice have always been based on the Eclipse Requirements Modeling Framework (RMF) as incorporated in the TopCased Model Based Systems Engineering (MBSE) tool-chain.  That project is now being migrated to PolarSys, though the requirements portion appears to have been incorporated as a standalone component, ProR.
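For anyone who hasn’t worked inside an MBSE tool-chain, the core idea is simpler than the tooling suggests: requirements carry typed trace links back to their sources, and impact analysis is just a walk over those links. As a rough illustration only (this is not the RMF/ProR data model, and the requirement and source identifiers below are made up), a few lines of Python capture the essence:

```python
# Minimal sketch of requirements traceability (hypothetical data, not the
# RMF/ProR model): each requirement records the sources it derives from,
# and impact analysis walks those links when a source changes.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Requirement:
    req_id: str
    text: str
    derived_from: List[str] = field(default_factory=list)  # upstream source IDs

REQS = [
    Requirement("REQ-001", "Cardholder data at rest shall be encrypted.",
                ["PCI-DSS-3.4"]),
    Requirement("REQ-002", "Program status shall be reported using EVM metrics.",
                ["ANSI-748"]),
    Requirement("REQ-003", "Key management shall follow corporate policy KM-01.",
                ["PCI-DSS-3.4", "POLICY-KM-01"]),
]

def impacted_by(source_id: str) -> List[str]:
    """Return requirement IDs that trace back to a changed source."""
    return [r.req_id for r in REQS if source_id in r.derived_from]

if __name__ == "__main__":
    # If PCI DSS 3.4 is revised, which requirements need review?
    print(impacted_by("PCI-DSS-3.4"))  # -> ['REQ-001', 'REQ-003']
```

Coverage is the same question in reverse: any source with no requirements tracing back to it is a gap.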

In traditional Microsoft fashion, a good set of tools based on the Collaborative Systems Requirements Modelling Language (CSRML), a goal-oriented markup for requirements modelling, has recently come to my attention.

If there’s anything you need a formal requirements model for, but without the overhead of a full MBSE approach, it’s certainly worth looking into.  Questions regarding the point of modeling non-engineering activities are relatively predictable, and I’d like to address a couple of the more common ones asked by those represented in the Venn diagram below.

[Venn diagram: Actors in Enterprise Compliance]

 

“What’s wrong with traditional BPM, CASE, RM, or other IDE type of tools?”

Unfortunately, Papyrus, TopCased, and many similar applications are (a) less than user-friendly for those without a background in UML, and (b) limited in their scope of representation and in access for collaboration.  From that perspective I wanted to consider alternative approaches, and the explosion of collaborative IDEs for software development leaves me hoping for a similar capability for modeling.

There is certainly no shortage of collaborative applications, and I’ll follow up on some other shortcomings in the current SaaS market in a separate post. A prime example of a promising entry, however, is the class of IDEs that provide tangible value to teams, such as Cloud9 (use Chrome or FF).  This SaaS-modeled IDE provides real-time visibility between members working on a project, and it’s an incredibly efficient tool I use to work with remote developers when performing interviews and skills assessments.  While I firmly believe nothing quite replaces a face-to-face interview, being able to work through a real-world project gives me insight into a candidate’s thought processes and their fit with the team dynamic they’re being interviewed for, while saving on travel costs or the expense of a poor hiring decision.  Alternately, something as simple as providing training, guidance, and leadership support to a remote employee who needs assistance with a concept is facilitated through such tools. Of course we’ve got many options for this, from simple screen-sharing and chat applications to full office suites such as Office365, Google Docs, or Office 2010 (when backed by SharePoint), all of which support real-time, multi-user, collaborative editing.  Extending the IDE as an asset into this domain solves two intermediate problems in having our requirements and models generate value for the organization:

  1. Enterprise architecture (EA) is often seen by personnel as an esoteric representation of the organization and its processes.  Without open access, visibility, ease of manipulation, and methods of sharing EA data, it often ends up gathering dust in the corner instead of being used to drive change. By enabling a collaborative and data-driven view of the enterprise (whether business process or IT architectures), the data is opened up to creative possibilities, and transparency helps maintain the information, preserving the original investment in investigation.
  2. Addressing requirements and developing compliance plans within the enterprise is noble, but it only touches on the underlying need for traceability. With a more collaborative and diverse audience, the ability to trace requirements, their precedence, and their sources, mapping them against each other, would bring a degree of transparency to Governance, Compliance & Risk (the subject of my next article).

“Why do I need to model my compliance needs? What’s wrong with using ABC?”

Some might accuse me of overcomplicating what is, on the surface, a relatively straightforward question: “What are our compliance requirements, what policies and processes do we have to cover them, and how do we ensure due diligence in validating adherence to those policies?”  The topics might include subcontractor management, ANSI EVM controls, IA control requirements for PCI, or any of a myriad of complementary frameworks. In the typical firm, however, the perspective needed to answer this line of questioning is rarely found in one person or group, and without collaborative ideation it’s very likely that either (a) something will be missed, or (b) the information won’t be effectively communicated to personnel.  In either case, a failure of due diligence may leave the firm at risk.

Often I’ll see this documented as spreadsheets with a matrix of tabs, but that fails the traceability test and doesn’t lend itself to automation in the case of an audit.  It also fails to offer extensibility at scale, doesn’t give the organization an efficient way to communicate the requirements to program/project managers or their staff, dramatically increases the overhead required to train and validate program execution, and increases the rework or rediscovery needed by personnel who need that data.
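By way of contrast, here’s a hedged sketch of what becomes possible once the control-to-policy mapping lives as structured data instead of a matrix of tabs. The framework and control identifiers are illustrative placeholders, not a real catalog, but the gap and evidence checks are exactly the kind of thing a spreadsheet can’t hand an auditor on demand:

```python
# Hedged sketch: once control-to-policy links are data, coverage gaps and
# missing audit evidence can be reported automatically. Control and policy
# names are illustrative placeholders, not a real control catalog.
controls = {
    "PCI-DSS-8.2":  {"policy": "Password Standard v3",     "evidence": "AD GPO export"},
    "PCI-DSS-10.1": {"policy": "Logging & Monitoring SOP", "evidence": None},
    "EVM-GL-06":    {"policy": None,                       "evidence": None},
}

def coverage_report(mapping: dict) -> None:
    """Flag controls lacking an implementing policy or audit evidence."""
    for control, link in sorted(mapping.items()):
        if link["policy"] is None:
            print(f"GAP:  {control} has no implementing policy")
        elif link["evidence"] is None:
            print(f"RISK: {control} maps to '{link['policy']}' but no evidence is on file")
        else:
            print(f"OK:   {control} -> {link['policy']}")

coverage_report(controls)
```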

Blame it on my firm belief in quantitative measurement and management as not only a “best practice” but a “required practice” for high-performing teams and organizations; or perhaps on my firm belief in transparent leadership and governance.  Enough people have heard my argument against spreadsheets as a form of enterprise knowledge management to understand why I think this is a bad idea, and why the way the typical firm handles it today needs something different. That something needs to be easy to use for the small to mid-sized firm, able to tie into ERP and audit systems, and collaborative, treating EA and governance as reusable, transparent, and executable enterprise assets.

Though I say it jokingly to my colleagues, there is truth when I note that I’ll need to consider an MBA after I complete my PhD, and in all likelihood should follow that with a JD; there’s a niche opportunity for systems, and for people, with a deep understanding across business, law, and technology.

“So what can we do about it?”

While I’m often accused of having big ideas, being able to develop solutions for most of them is a core strength I rely on to demonstrate value through prototypes.  To that end, and based on my research and papers on the extent to which regulatory, contract, and corporate governance cross organizational disciplines, I’ve begun modeling a SaaS solution for publishing control mappings and compliance methods.  I’m through the generic compliance model and have a number of concepts for carrying it forward, but I’m leaning toward a community-driven site and application that could tie into my parallel efforts in using predictive analytics for effort and cost estimation.  As each gets a bit more polished I’ll publish them under an open license; in the meantime I certainly appreciate any insight, critique, or commentary on the subject.

So jump on in and share your thoughts.  Do you see this as a problem, or is it a niche need? How do you manage the myriad set of policies, procedures, and alignment to the governance frameworks in your industry? Any particular tools that you use, and how do you ensure that those processes are executable by staff? Do you have an orchestrated set of executable workflows, or is it tribal knowledge and lots of training (for hopefully compliant programs)?

 

Posted on January 5, 2014 in Business, Information Technology

 


CNAP – Semester 1 – Chapters 10-15

(A Story of “How Levii met Tara”)

The fifth and final installment of the Semester 1 training series. I originally developed this material late at night, or in the wee hours of the morning, depending on your perspective. All four Semesters were originally developed while living in Abilene, TX, with later revisions after I’d moved to San Antonio, TX; O’Fallon, IL; and Sembach, DE. The last update was made around 2005-2006 as a final handoff of instructional material to the 21st OWS systems flight to help prepare them for the CCNA examination, and as part of an after-hours DoD 8570 study group I formed for the IT/IS airmen in the area.


Much of the work that went into developing a curriculum, summarizing data for these slides, and spending countless hours with stacks of books from Cisco Press could certainly be considered preparatory to other courses I’ve taught, to any number of papers I eventually did write, and a foundation for my self-identity as a person who enjoys sharing knowledge; but that framing would miss a key outcome. Those who know me have likely heard the story of how one of my earliest managers (and current colleague, BTW) “conned” me into moving to Abilene. It was the late nights studying and working on this material, however, that introduced me to Tara. She took an interest in the strange young guy who showed up at IHOP every night with a laptop and a stack of books, and who would eat his dinner with a gallon of coffee. So while I certainly hope this material finds a use for someone else, the opportunity to go back through it brings back memories of that first Christmas away from home, working on these slides, and the period when I was making a new friend who, years later, would become my wife.

This presentation covers the concepts of routing and addressing. Likely one of the most difficult areas for many people to initially grasp, subnetting/CIDR was the subject of a few hours of practice and working through problems as a group. I would typically run through three different methods of solving the problem, with increasing levels of decomposition and explanation tying back to the other methods. It’s my experience that subjects like this, as with many others involving formulas, solutions, proofs, sequential problem breakdown, and the like, which can be solved multiple ways, are picked up quickly by the first 10% of a population, who tend to be comfortable with multiple methods. The next 65% or so of students will pick it up given a second round of detailed explanation, but this group (and the final 25%) can be thrown back “off track” when alternate methods or further decomposition of the problem are demonstrated. At that point, once a student had the “ah ha!” moment, I would typically advise them to excuse themselves from class for the remainder of the day and give the sample problems a practice run a few hours later.
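For anyone revisiting this material today, the by-hand binary methods the course drilled can be cross-checked in a few lines of Python. The example below is purely a modern illustration of a typical drill (split a /24 into /26 subnets and place a host), not something from the original curriculum:

```python
# Modern cross-check for a classic subnetting drill (not part of the original
# course material): split 192.168.10.0/24 into /26 subnets and locate a host.
import ipaddress

net = ipaddress.ip_network("192.168.10.0/24")

# Four /26 subnets, each with 62 usable host addresses.
for subnet in net.subnets(new_prefix=26):
    hosts = list(subnet.hosts())
    print(f"{subnet}  first={hosts[0]}  last={hosts[-1]}  "
          f"broadcast={subnet.broadcast_address}")

# Which /26 does 192.168.10.77 fall in?  -> 192.168.10.64/26
addr = ipaddress.ip_address("192.168.10.77")
print([str(s) for s in net.subnets(new_prefix=26) if addr in s])
```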

Reference Files:

The assessments and practical exam were delivered at the end of the Semester and are available below for reference.

 

Posted on December 20, 2013 in Documents & Applications

 

CNAP – Semester 1 – Chapters 8 & 9

The fourth part of the inter-networking training material provides basic information on infrastructure and cabling standards.


Reference Files:

The assessments and practical exam were delivered at the end of the Semester and are available below for reference.

 

Posted on December 20, 2013 in Documents & Applications

 

CNAP – Semester 1 – Chapters 5-7

The third installment of the material, this presentation covers media and network topologies, and introduces MAC addressing along with basic Layer 2 and 3 functional knowledge.


Reference Files:

The assessments and practical exam were delivered at the end of the Semester and are available below for reference.

 

Posted on December 20, 2013 in Documents & Applications

 

CNAP – Semester 1 – Chapters 3 & 4

The second of five presentations used as background for the training programs I ran until around 2005. This deck provides basic information about LAN devices, network topologies, and the basics of electrical theory used for inter-networking communications.


Reference Files:

The assessments and practical exam were delivered at the end of the Semester and are available below for reference.

 

Posted on December 20, 2013 in Documents & Applications

 

CNAP – Semester 1 – Chapters 1 & 2

These are the first chapters of the Semester 1 slides I developed for use in the Structured On-the-Job Training (SOJT) program for the Air Force (AF) Cisco Networking Academy Program (CNAP), along with the background lab configuration and practical exam I required for course completion. The courses were taught over a 10-day period of four (4) hours per day.  The individual lab assignments, some of the graphics, and the reading material are (C) Cisco Systems 2003 and are utilized under the fair-use provisions of copyright law for training and education purposes.  Feel free to use any material on this site; attribution back is greatly appreciated.


Reference Files:

The assessments and practical exam were delivered at the end of the Semester and are available below for reference.

 

Posted on December 20, 2013 in Documents & Applications

 
 