
Category Archives: Information Technology

Merging Word documents … What a pain

For the umpteenth time in my career I had a document in MS Word (2010 in this case) that was reviewed by multiple individuals (about 8), with changes to formatting and content along with reviewer comments. These then had to be merged back into a single document – certainly easier than it used to be, but still incredibly inefficient. Let’s look at the normal flow:

  1. Document Created (O1)
  2. Send to Review (email or ??)
  3. Receive n copies of the document back (R1 – Rn)
  4. Make a copy of the original doc (O2) and combine with the revised copy R1 [in the original document]
  5. Save O2 and close both files (or all of the merge panes).
  6. Select the combine option again, find O2 and merge in R2.
  7. Rinse and repeat steps 5 & 6 through Rn.

Certainly using the version tracking of SharePoint 2010/2013 would be considerably easier while also allowing for concurrent editing, but that’s not always an option. Thinking of a large organization where this is done many hundreds of times each day, by thousands of employees, I was surprised that a small utility to perform the function of “merge these N files with O1” didn’t pop up on a quick Google/StackOverflow search. In my one instance, I missed a file … or messed something up the first time through, which killed about 20 minutes of my day that could have been spent actually addressing the commentary. To that end I decided to do something about it, which I’ll share in source once I’ve gone through it enough that it wouldn’t be embarrassing. It’s gotten a little bit cleaner, and a whole lot faster – but since it’s based on Interop it’s a temporary solution until the feature is fully implemented in Eric White’s EXCELLENT OpenXmlPowerTools library (link to the issue).

For the time being we’ll just call this WordMerge, and it runs as a standalone executable. There’s not a heap of validation and error checking in place at the moment (e.g. a bad output path could be typed in, which will throw a general exception), but it has been through the paces and works pretty well up to the 50 documents I tried. For my use, I’m perfectly happy with how it runs – but if anyone else grabs the concept before I get code out there, I’m open to suggestions (a SharePoint extension, Office plugin, and a WPF version are already planned BTW).
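
For anyone who wants to roll their own before I publish the code, the core loop is small. Below is a minimal sketch in C# against the Microsoft.Office.Interop.Word PIA – the same Interop dependency mentioned above – that opens the original and merges each reviewed copy into it in turn, roughly automating steps 4–7 of the manual flow. The MergeReviews method and the paths are hypothetical names for illustration; this is not the WordMerge source itself.

    // A minimal sketch, not the actual WordMerge implementation.
    // Assumes a project reference to the Microsoft.Office.Interop.Word PIA (Word 2010).
    using Word = Microsoft.Office.Interop.Word;

    class MergeSketch
    {
        // originalPath = O1, reviewPaths = R1..Rn, outputPath = where the combined copy ends up.
        // Paths should be absolute, since Word resolves them against its own working directory.
        static void MergeReviews(string originalPath, string[] reviewPaths, string outputPath)
        {
            var app = new Word.Application { Visible = false };
            try
            {
                Word.Document target = app.Documents.Open(originalPath);
                foreach (string review in reviewPaths)
                {
                    // Pull the tracked changes from each reviewed copy into the open document,
                    // keeping the result in the current (original) target.
                    target.Merge(review, Word.WdMergeTarget.wdMergeTargetCurrent);
                }
                target.SaveAs2(outputPath);
                target.Close(Word.WdSaveOptions.wdDoNotSaveChanges);
            }
            finally
            {
                app.Quit();
            }
        }
    }

The obvious downside is the one already called out: Interop means Word has to be installed wherever this runs, which is exactly why the OpenXmlPowerTools route is the better long-term answer.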

You can see the interface in the screenshot below … not much to explain.

[Screenshot: the WordMerge interface]

Certainly, it isn’t foolproof to merge everything in at once. If a person moved a lot of content, and other reviewers modified that content, MS Word would be unaware of the “right” order in which to resolve conflicts. In these cases there may be straggling words that weren’t part of the move (especially if the move was processed after the changes in the merge order), which I’ve tried to address by allowing a re-order in the merged documents list. That notwithstanding, thankfully the ability to toggle reviewers and types of changes in the review pane makes reconciling a much simpler process.

I’d be interested in how you normally handle disparate review documents … one at a time, copying over what you’d like to keep; merging them all by hand and accepting/rejecting changes; or something else entirely? I’ll pull thoughts into the merger and possibly tie it with some other work I’m contemplating that uses NLP parsing systems in conjunction with context-free grammar generation to assist edits and rewrites from a Knowledge Management repository. That’s definitely further out on the horizon though. Next up is a syntax highlighter for Word – simply because I’m tired of the wasted time and inconsistent formatting that inevitably result when you’ve got to embed source in software documentation. If there are suggestions for that before I get it released, feel free to send them my way.

Grab it, use it, save yourself some headache. If you’ve got feedback for me, let me know.

//Levii

 

Qiqqa Review

Been pretty impressed with the current generation of reference management software. Looking at Qiqqa now. Premium looks slick. I’ll come back for a further review, but if you want to check it out before I get to that point, use this link and you’ll get me a free month of premium, and two weeks free for yourself. Don’t worry, no credit cards or anything like that are required; nor does the free version cost a thing. Probably worth looking into if you’re just now starting to get into reference management solutions.

The Link: http://www.qiqqa.com/Register/106431

//Levii

 

Posted on July 17, 2013 in Information Technology

 


USAFE Public Weather Site (21OWS)

Though it’s been a long time since I was there, it’s good to know that my work has outlasted me. If you’re looking for European navigation or aeroweather products, one of the best sources is still the 21st Operational Weather Squadron. Considering that I left in 2008, the fact that the site design is still intact is actually amazing to me. And while they’re no longer at Sembach, that’s okay – the domain lives on.

http://ows.public.sembach.af.mil/

Even if you’re not remotely interested in the weather, the tools are an interesting lesson in my JavaScript history … oh, the days before jQuery and browser fights – how I do not miss you.

http://ows.public.sembach.af.mil/index.cfm?section=Tools

//Levii

 


The Economics and Risks of Electronic Employee Monitoring

Employees and other inside personnel pose some of the largest risks to an organization. The overall risk profile exists across a continuum that includes losses based on employee productivity, breach of confidentiality, data integrity, and overall availability of organizational resources. It is therefore essential that a firm develop and maintain an effective posture for mitigating these risks, one that is compliant with statutory requirements and limitations while balancing the rights of employees to maintain a modicum of privacy in and out of the workplace.

 

Posted on February 1, 2013 in Business, Information Technology

 


Outsourced Data Handling: Brief Thoughts of Negligent Entrustment Applicability

I find this subject area to be among the most interesting and fluid in IT today, as the area where law meets technology is still being formed in the US. Our system of common law, combined with the speed of our legislative process (or lack thereof) relative to that of technological innovation, leaves gaps in legal findings that are being tried on a regular basis to develop judicial precedent. That precedent – the interpretation of a tort as compared with what a judge may find similar in other cases, combined with the doctrine of stare decisis (until opposing legislation is enacted) – governs these subjects more than the actual state of congressional acts or agency regulation. It is an incredibly complicated subject: many educated in law are not technologists, and many technologists have an inadequate education in business law.

Negligent entrustment is covered under state civil codes within the personal injury set of torts (Kionka, 1999). What Rustad and Koenig (2007) detail are issues that might someday become liabilities for a US company, not necessarily those that are currently primary liability concerns. It is still an interesting thought exercise and a set of items that must be evaluated to form an effective risk profile of outsourcing activities. Also, since an individual is entitled to sue for anything, the costs for an organization to protect itself from suit – even in those cases where it is settled or won – could be substantial. In order to reduce these eventualities, the due care and due diligence of audit and contract enforcement, validation of contract performance measurements, and adherence to the law of the land where the data originates must be among the foremost concerns of the CIO, CISO and compliance officer. The ways in which courts rule on data in the coming years have the potential to affect the IT and software industry in the US. A finding in favor of suit for negligent entrustment in outsourced data would set a dangerous precedent, deeming the data itself a dangerous item. This concept could then easily bleed into many other areas, including software development liability (which is cause for another paper entirely). While there have been attempts to prosecute authors of hacking tools with criminal offenses, to date that activity has been upheld as covered under the 1st Amendment – but should that protection fall, all users and creators of data would have significant potential liabilities foisted upon them.

As pointed out by Rustad and Koenig (2007), the incidence of lawsuits brought for negligent security practices is on the rise, but all cases that have resulted in an award have been the result of direct liability. In US courts, based on the research conducted and literature reviewed, there have yet to be any cases of indirect liability or negligent entrustment decided, since negligence itself is specific to a failure of due care (in the absence of strict liability statutes) and there is no basis for negligent entrustment when the instrument itself is not directly capable of causing harm (Kionka, 1999). Additionally, when there is a superseding cause that would not have been reasonably foreseeable, there would not be an issue of liability. If, however, the outsourcing operations fail to include the due diligence and due care of a reasonable man, then the superseding cause argument would fail, and liability would revert to the company as the tortfeasor, since a reasonable expectation of data breach would likely meet the requirements for a proximate cause liability claim (Clarkson, Miller & Jentz, 2003).

Data breaches are inevitable (Huang, Hu & Behara, 2008), and the California appellate court found that such a claim of inevitability on the part of the claimant, or evidence of that inevitability, could be used to show that the negligence of the defendant, if any, was not a proximate cause (Smith v. San Francisco, 1953). While it remains essential to protect data under direct liability scenarios, where a failure to exercise due care can be actionable, imagine a world where this was not the case, or where the ability to exclude the warranties of merchantability and/or suitability for purpose could not be accomplished through contract. Oracle Corporation might be held liable because its software contained a vulnerability that was exploited by a hacker attacking a bank, or Microsoft could be sued if a workstation crashed and lost some personal data. As highlighted by Ferrera, Lichtenstein, Reder, Bird and Schiano (2004), both of these situations can be shown to have actual damages, though allowing such inevitable events to fall back on the progenitor of the system would have chilling repercussions for all transactions and systems in the digital world.

 

References

Ferrera, L. R. (2004). CyberLaw: Text and cases (2nd ed.). Thomson Corporation.

Huang, C., Hu, Q., & Behara, R. S. (2008). An economic analysis of the optimal information security investment in the case of a risk-averse firm. International Journal of Production Economics, 114(2), 793–804. doi:10.1016/j.ijpe.2008.04.002

Kionka, E. J. (1999). Torts in a nutshell (3rd ed.). St. Paul, MN: West Group.

Rustad, M. L., & Koenig, T. H. (2007). Negligent entrustment liability for outsourced data. Journal of Internet Law, 10(10), 3–6. Retrieved from http://web.ebscohost.com.library.capella.edu/ehost/detail?sid=e04a605f-8b74-40ec-8586-a33effec288c@sessionmgr115&vid=1&hid=112&bdata=JnNpdGU9ZWhvc3QtbGl2ZSZzY29wZT1zaXRl#db=bth&AN=24619583

Smith v. San Francisco, 117 Cal. App. 2d 749, 256 P.2d 999 (1953)

Clarkson, K. W., Miller, R. L., & Jentz, G. A. (2003). West’s business law: Text and cases (9th ed.). Thomson Learning.

 

Posted on November 17, 2012 in Business, Information Technology

 

Selection of VPN Solutions

Over the last decade (and specifically since the revision of the encryption export control regulations), there has been a considerable amount of research, development, and adoption of VPN technology sets used not only by business and government, but also in consumer markets. Indeed, protection of data in transit, and at rest, is a best practice in industry, a requirement in government, and is finding use in a growing number of commercial applications. Given the wide array of available technology, the question of which type is best doesn’t really need to be asked. Techniques at the application layer such as SSL and SSH, technologies at the network layer such as IPsec, tunneling at the link level with PPTP or L2TP, and protection at rest using either hardware or software disk encryption can all be used together to ensure that data confidentiality and integrity are maintained throughout the data lifecycle. Even with this complete stack, these protocol selections only touch on the various options, combinations and uses of encryption within each layer.
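
As an aside on the application-layer end of that stack, this is roughly what per-connection TLS protection looks like from code – a minimal C# sketch using SslStream, with “example.com” standing in as a placeholder host. A VPN pushes the same kind of protection down the stack so that every application on the link gets it without any code changes.

    // Minimal sketch: application-layer protection of a single connection via TLS.
    // "example.com" is a placeholder, not an endpoint from this post.
    using System.Net.Security;
    using System.Net.Sockets;
    using System.Text;

    class TlsSketch
    {
        static void Main()
        {
            using (var client = new TcpClient("example.com", 443))
            using (var ssl = new SslStream(client.GetStream()))
            {
                // Performs the TLS handshake: server certificate validation and key exchange.
                ssl.AuthenticateAsClient("example.com");

                // Everything written to the stream from here on is encrypted in transit.
                byte[] request = Encoding.ASCII.GetBytes("GET / HTTP/1.1\r\nHost: example.com\r\n\r\n");
                ssl.Write(request, 0, request.Length);
            }
        }
    }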

Given so many options, the selection of which technology to use – like many other things in IT – must be based on the answers to “what are you trying to do”, “who is going to be using the system”, and “what are your requirements”. At the client end, implementation has become nearly trivial. Windows 7, for example, provides native support for IKEv2, PPTP, L2TP/IPsec, and SSTP with a single drop-down selection. Authentication protocols for these can be matched with current infrastructure requirements, and data encryption algorithms can enforce data protection within the established tunnel. With so many options, the classification of type will depend on what the requirement is. Should it be classified by authentication mechanism, by layer of operation, by supported encryption strength, or by the endpoints being connected? The short answer is … it depends on what the VPN is intended to accomplish.

At its most basic, however, a VPN is designed to provide remote connectivity from one system to another. This connectivity may support remote device access (e.g. a travelling laptop) to the corporate LAN, or interconnect LANs with each other in a MAN/WAN scenario. The purpose of either approach, at its simplest, is to provide secure connectivity between these systems in order to protect the confidentiality and integrity of data in transit. Given the vast array of options in endpoint devices, software methods, hybrid approaches and the various protection levels (and associated costs) that each can provide … selection is in the eye of the beholder.

//Levii

 

Posted on August 24, 2012 in Information Technology

 

The Principle of Least Privilege

Introduction

The materials on http://cccure.org, and specifically the guide by Krause & Tipton, certainly supplemented the material by Shon Harris while I was studying for my CISSP. Even though I hadn’t seen this page since sometime in 2007, a recent discussion thread re-linked it as a starting point for a discussion on the principle of least privilege.

The Principle

While the principle of least privilege can be briefly summarized and defined, the context in which it is evaluated through a given access control method is quite a different undertaking. The principle of least privilege can be succinctly defined as “limiting the access of authorized users to data they require to perform their duties” (Conrad et al., 2010, p. 47). All access not required would then be denied. This is slightly less restrictive than the need-to-know standard, which would be a subset of the authorizations offered under the least privilege approach. Both can be considered “deny by default” provisions, where access must be specifically allocated by a person authorized to grant that access.
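
To make “deny by default” concrete, here is a minimal C# sketch of an explicit-grant check: a subject may touch a resource only if that exact access has been provisioned, and anything else falls through to a refusal. The AccessPolicy type and its members are illustrative names of my own, not something drawn from Conrad et al. or the Krause & Tipton guide.

    // A minimal, illustrative deny-by-default check – not from any cited source.
    using System.Collections.Generic;

    enum Permission { Read, Write, Delete }

    class AccessPolicy
    {
        // Explicit grants only: (user, resource) -> permissions that were provisioned.
        private readonly Dictionary<(string User, string Resource), HashSet<Permission>> _grants =
            new Dictionary<(string User, string Resource), HashSet<Permission>>();

        public void Grant(string user, string resource, Permission permission)
        {
            if (!_grants.TryGetValue((user, resource), out var perms))
            {
                perms = new HashSet<Permission>();
                _grants[(user, resource)] = perms;
            }
            perms.Add(permission);
        }

        // Deny by default: anything not explicitly granted is refused.
        public bool IsAllowed(string user, string resource, Permission requested)
            => _grants.TryGetValue((user, resource), out var perms) && perms.Contains(requested);
    }

Whether MAC, DAC, or a role-based scheme sits on top mostly changes who is allowed to call Grant – which is exactly the delineation discussed next.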

Authorization to grant access is the key delineator between the two primary access control methods: Mandatory Access Control (MAC) and Discretionary Access Control (DAC). MAC restricts this authorization to an administrator of the data; owners and users are unable to provision or grant rights to other users. In the DAC model, however, this privilege is granted to data owners and creators, effectively decentralizing the assignment of privilege to content owners.

Not well covered in the linked article by Krause & Tipton is one additional model: Non-Discretionary Access Control (NDAC). The access control methods typically thought of when discussing NDAC are Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC). While many will argue that it belongs as a subcategory of MAC, it exists as a separate entity. Consider the following:

  • “…RBAC is sometimes described as a form of MAC in the sense that users are unavoidably constrained by and have no influence over the enforcement of the organization’s protection policies. However, RBAC is different from TCSEC (Orange Book) MAC. According to NIST, ‘RBAC is a separate and distinct model from MAC and DAC.’ This is a frequently confused (and argued) point: non-discretionary access control is not the same as MAC.” (Conrad et al., 2010)
  • The NIST source cited by Conrad has moved, but is available here: http://csrc.nist.gov/groups/SNS/rbac/

Now that least privilege, need-to-know, and the models supporting the assignment of authorization have been defined, the next area to tackle is that of policy and supporting models. Supporting models are subcategorized into a matrix of security models which include: confidentiality, integrity, information flow, noninterference, take-grant, access-control matrices, Graham-Denning, Harrison-Ruzzo-Ullman, Zachman Framework and Brewer-Nash models.

These models do overlap. For example, the lattice-based, state machine, and Bell-LaPadula models are all confidentiality models, while the Biba and Clark-Wilson models are integrity models. Each of these, however, falls into the larger category of information flow models.

Avoiding the temptation to define and explain each of these, it will have to be sufficient for now to say that these models are in turn supported by the system’s mode of operation. Modes of operation are codified within the Common Criteria, adapted from DoD 5200.28, and provide operating modes for each of the MAC models.

While these models themselves do little to protect organizational assets, adherence to the governance policies they define (including both administrative and technical controls) provides the structural foundation to protect the confidentiality, integrity and availability of information.

 

References

Conrad, E., Misenar, S., & Feldman, J. (2010). CISSP study guide. Burlington, MA: Syngress.

ISO/IEC. (n.d.). Common Criteria Documents. Retrieved from NSA Common Criteria Evaluation Scheme:
http://www.niap-ccevs.org/cc_docs/

Krause, M., & Tipton, H. (1997). CISSP Open Study Guides. Retrieved from Handbook of Information Security Management:
http://www.ccert.edu.cn/education/cissp/hism/087-089.html

NIST. (n.d.). DoD 5200.28. Retrieved from Computer Security Research Center:
http://csrc.nist.gov/groups/SMA/fasp/documents/c&a/DLABSP/d520028p.pdf

 

 

Posted on August 9, 2012 in Information Technology

 
 