Monday, May 29, 2017

Disaster Recovery vs Business Continuity

Disaster Recovery is not the same thing as Business Continuity. For an example, you need look no further than the recent British Airways system failure. Lasting several days, the outage affected more than 75,000 passengers across 170 airports in 70 countries. During the outage, British Airways could not process passengers, meaning outbound flights could not take off, and planes were stuck at gates.

The problem began with a power surge that "only lasted a few minutes," but the backup system did not work properly.

There's a lot of finger-pointing at British Airways, but I prefer to use this as an example of how Disaster Recovery is not the same thing as Business Continuity.

A disaster is when things fail.

Disaster Recovery is the IT folks bringing servers back up and getting computer systems back online.

Business Continuity is how the business keeps operating in the face of an IT system failure.

In the case of British Airways, the IT folks were busy restoring power to the mainframe, bringing up the backup system, and synchronizing data. That's Disaster Recovery.

Meanwhile, in the airports, on-site staff used whiteboards to communicate flight status, and used runners and other communication methods to get information between gates. That's Business Continuity.

Take this opportunity to review processes and procedures in your own organization. What does your Disaster Recovery plan look like? Do you have one? Does everyone in the IT organization know how to bring systems back online in the face of failure?
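
As one small, concrete piece of a Disaster Recovery plan, consider scripting the checks you would run after a restore. Below is a minimal sketch in Python of a post-recovery smoke test; the hostnames and ports are hypothetical placeholders, and a real plan would also cover restore procedures, data synchronization, and escalation paths.

#!/usr/bin/env python3
"""Minimal post-recovery smoke test (a sketch, not a full DR plan)."""
import socket

# Hypothetical critical services; substitute your own environment's.
CRITICAL_SERVICES = [
    ("db.example.internal", 5432),       # primary database
    ("booking.example.internal", 443),   # customer-facing application
    ("mq.example.internal", 5672),       # message queue
]

def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    failures = [(h, p) for h, p in CRITICAL_SERVICES if not is_reachable(h, p)]
    for host, port in failures:
        print(f"STILL DOWN: {host}:{port}")
    print("All critical services reachable." if not failures
          else f"{len(failures)} service(s) still unreachable.")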

At the same time, what does your Business Continuity plan look like? This needs to come from your business units, to plan how they will keep operating if the technology becomes unavailable. The Business Continuity plan cannot be to yell at the CIO until systems come back online. Every business unit needs to create and manage their own plan about how they will continue to do business in the sudden absence of technology.

Monday, May 15, 2017

Common security issues

I have been working in IT for over twenty years. I got my start as a Unix systems administrator at a small company, then moved into a "working manager" role at another small company, before moving into higher education where I eventually led the Enterprise Operations and Infrastructure teams for a Big Ten university. After that, I transitioned to a campus leadership role, as campus CIO at a coordinate campus. Now I am the chief information officer for county government.

In my time working in technology, I have watched the rise of online security. When systems are connected to the Internet, they are at risk of cyber attack. In 2017, if cyber security is not a key focus area for your organization, you are falling behind.

As if to demonstrate the point, the UK's National Health Service (NHS) was recently stung by ransomware, a common form of attack. The NHS was one of many organizations worldwide hit by the WannaCry attack. And if you're still running Windows XP, it's only going to get worse.

What are the security issues you should be most concerned about? While I don't claim to be a security expert, I can speak from some experience here. This is part of my list of the top security improvement opportunities for most organizations:
  1. Patching, and retiring out-of-date systems
  2. Intrusion detection
  3. Automated system monitoring
  4. Private IP space and bastion hosts
  5. Distributed architecture
  6. Separation of privilege
  7. Access controls review
  8. Single sign-on, with two-factor authentication
  9. Firewalls and VLANs
  10. Data-at-rest encryption (see the sketch below)
What would you add to the list?
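
To make one item concrete, here is a minimal sketch of item 10, data-at-rest encryption, in Python. It uses the third-party cryptography package (pip install cryptography), and the file name and record contents are placeholders. In a real deployment, the key would live in a key management service, never alongside the data it protects.

from cryptography.fernet import Fernet

# Generate a key once and store it securely (e.g., in a key management
# service). Anyone holding the key can read the data, so never keep it
# next to the files it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt sensitive data before it is written to disk...
plaintext = b"employee record: placeholder"
with open("record.enc", "wb") as f:
    f.write(fernet.encrypt(plaintext))

# ...and decrypt it only when it is actually needed.
with open("record.enc", "rb") as f:
    assert fernet.decrypt(f.read()) == plaintext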

Monday, May 8, 2017

High academic vs Low academic

I loved this article from The Conversation, about how Academics can change the world, if they stop talking only to their peers.

Having spent seventeen years in higher ed, I found that much of the article resonated with me. From the article:
This suggests that a lot of great thinking and many potentially world altering ideas are not getting into the public domain. Why, then, are academics not doing more to share their work with the broader public? The answer appears to be threefold: a narrow idea of what academics should or shouldn’t do; a lack of incentives from universities or governments; and a lack of training in the art of explaining complex concepts to a lay audience.
The article reminds us that many academics don't believe it's their job to communicate with the lay public. There's really no motivation to do so. If you're chasing tenure, writing in trade magazines or newspapers may not "count" towards your publication credit, anyway. And once you have tenure, writing for non-peer-reviewed venues may be considered "cheap."

I observed that attitude several times when I worked in academia. I remember our campus held an annual celebration of scholarly faculty achievements. Faculty were invited to submit samples of their work that had been published in the previous year. I also remember commiserating with friends on the faculty who failed to get recognized because they wrote in non-academic journals and magazines. They wrote for the common reader, not other academics.

The article also highlights that even when academics feel motivated to write for a broader audience, it may be difficult for them to get published. As the article points out, "Writing an article for an academic journal is a very different process to penning one for those outside the academy." When you've been trained to write "academic" speech for your whole career, it can be difficult to construct prose that is more approachable.

I have a similar view. When I was in my Master's program, I referred to three different styles of writing: "High academic," "Medium academic," and "Low academic."

High academic is typical for many peer-reviewed journals. This writing is often very dense and uses large words that demonstrate the author's command of the field.

Medium academic is more typical of undergraduate writing. It is less formal than high academic, yet more formal than what you find in the popular press.

Low academic tends to include most professional and trade publications. Low academic authors may sprinkle technical terms here and there, but generally write in a way that's approachable to their audience. They use contractions, although sparingly. Certain other conventions continue, however. Numbers are written out: "fifty" instead of "50," and "two-thirds" instead of "2/3."

In my Master's program, I learned to adjust my writing style according to my instructors' preferences. One professor might have a very formal attitude towards academic writing, so I would use High academic. Another professor might approach the subject more loosely, so I would write in Medium academic. When I translated some of my papers into articles for magazines or trade journals, I wrote in Low academic.

Academics need to adopt a similar approach. It's okay to write in High academic when you are writing for your peers in academic journals. But the academy needs to be visible to the public, to share its findings in a way that non-academics can understand. That means adopting a Low academic style.

The article concludes with a similar message, but also adds that universities need to change their attitude about what should "count" as "publication," saying:
Doing this sort of work ought to count towards promotions and should yield rewards for both universities and individual academics. Quality academic research and innovation are crucial. It is equally important, though, to get ideas out into the world beyond academia. It could make a real difference in people’s lives.
And I agree.

Thursday, May 4, 2017

My start with Linux

When I was an undergraduate physics student, I discovered this little thing called Linux. It was a great Unix operating system that I could run on my PC. And I've been a Linux fan ever since.

I wrote a full story for OpenSource.com, about How I got started with Linux. Please read it!

Monday, May 1, 2017

Every meeting

I'm taking a break this week (I'm grading final exams for the online class I'm teaching this semester) so instead of a longer post, I wanted to share this brief video from FastCompany.

» Every meeting you've ever been to (in two minutes)

How many meetings have you been to that are like this?

Sure, the video is funny, but that's not the reason I shared it. I'm hoping the video may knock you out of complacency and get you to recognize any of these bad behaviors in yourself. If you listen to yourself in your next meeting and hear some of these trite phrases, you need to find new ways to express yourself.

Friday, April 28, 2017

Raising the bar on technology

I wanted to expand on my article this week about how technology should contribute to an organization. Technology systems can be a boon to organizations when leveraged effectively.

But we need to manage that technology effectively, too. Too often, I have observed IT organizations fail at systems management. "Information Technology" is more than setting up a few servers. You can actually put your organization at risk if you don't have certain safeguards in place to keep systems available to your end users.

I have worked in technology for over twenty years. Much of that time was spent in infrastructure and operations, including leading the enterprise infrastructure and operations teams at the University of Minnesota: my teams supported over 1,100 servers at the university, running critical systems for over 65,000 students and delivering over 33,000 paychecks every two weeks.

In that role, I worked with our internal auditors frequently, on average a little over once a year. Not because I was the subject of an audit; my own department was audited only about once every five years. Rather, whenever the auditors examined another department, they would eventually review the servers that supported it, and that meant me.

So through my own experience, I have developed a (growing) list of best practices to manage technology, at a level that satisfied our auditors. Here are a few highlights from that list:
  1. Redundancy in the data center
  2. Architecture review
  3. Application management lifecycle
  4. Backup validation (sketched below)
  5. Disaster recovery planning
  6. Risk analysis
  7. Business value mapping
  8. Configuration management database
  9. Job automation
  10. Isolated file transfer
As I review this list, I think the data center (the first item in the list) is falling in importance. Most organizations are moving to the cloud, and I think that makes sense for many applications. Certainly we will always have some applications hosted locally, but it's not hard to predict that local data centers will be minimized, running only those systems that must connect to local devices (such as research equipment, or managing IoT devices) or cannot be migrated to the cloud for other reasons.
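
Item 4 deserves special emphasis: a backup you have never restored is only a hope. Here is a minimal sketch, in Python, of what automated backup validation might look like, assuming nightly backups are tar archives with a SHA-256 checksum recorded at backup time; the path and checksum below are placeholders.

import hashlib
import tarfile
import tempfile
from pathlib import Path

BACKUP = Path("/backups/nightly/app.tar.gz")  # hypothetical backup location
EXPECTED_SHA256 = "..."  # checksum recorded when the backup was taken

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, reading in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def validate(backup: Path, expected: str) -> bool:
    """A backup is valid if it is intact AND it actually restores."""
    # 1. The archive on disk still matches the checksum taken at backup time.
    if sha256_of(backup) != expected:
        return False
    # 2. A trial restore into a scratch directory proves the archive unpacks.
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(backup) as tar:
            tar.extractall(scratch)
    return True

A real process would go further, comparing restored record counts against production, but even this much catches the most common failure: backups that were silently corrupt all along.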

How does this compare to your environment? What would you add to this list?

Monday, April 24, 2017

Rethinking today's IT

The Enterprisers Project ran an interesting article last year about Rethinking how IT drives business value in a digital age. A year later, the article still holds true on several levels.

In the article, author Sven Gerjets argues that as technology becomes more commoditized and instantly available, IT organizations need to rethink how they bring value. Gerjets writes:

"IT is now competing in a segmented marketplace where technology is far-reaching, easily accessible, and created at high volumes, which allow for economy of scale. … Competing in this consumerized landscape cannot just be about technology. Technology is too accessible; IT has to be about value creation and about learning."

Of course, IT must occasionally re-invent itself to remain viable. I've written about this before. In the early days of office computing, "IT" often referred to the folks running and supporting the mainframe environment. But as the cost of computing dropped, departments could purchase their own technology. In the 1980s, we saw an influx of "personal computers" in the business setting. Suddenly, "IT" had to compete with the PC, and eventually adopted the PC as a business tool.

I got my first job in technology in the mid-1990s. Throughout my career, I have witnessed several upheavals in technology. The Web meant IT no longer had central control. Mobile devices like tablets and smartphones meant the "computer" was no longer a monitor sitting on your desk. The Cloud meant the network was the computer.

Every few years, IT needs to examine itself to see how we bring value and how we drive the business. As IT, are you "just" a support group, or are you engaged in how your organization operates? IT needs to empower organizations, and be a partner and a leader. Gerjets agrees, and concludes his article this way:

"We have to create an environment where our organizations are empowered to consciously learn, evolve, and raise the bar every day. This must be a learning organization that can intelligently collect information about its environment, is open to learning from others, learns from its failures and successes, experiments with new approaches, encourages problem solving, and most importantly retains and shares this information across the functions in IT. Organizational practices must be developed to systemically utilize these information stores in the IT planning processes to manage risk and to ensure that failure is not repeated while successful practices are repeated over and over again."