Tuesday, August 1, 2017

Bottle and Labels: The Latest Victim of the Modern Age

Nature posted a story last week about some new scientific findings that appear to bust a relatively recent verbal construct that has been rather central to the HR world and the discussion of Millennials in the workplace: the so-called "digital native". It seems this term, created to bottle certain people up into an easy-to-understand construct, may be just as meaningless as the now increasingly derided concept of "learning styles".

I have to admit, I somewhat blindly accepted the implication of this "digital native" label myself. Being a member of the newly minted "xennial" micro-generation (oh dear), it made some amount of sense that everyone born after me, or at least most of those in the Millennial group, were far more accustomed to digital everything, and analog nothing. For example, how many minutes has it been since you've heard another Gen Xer or older individual express concern over the lack of cursive classes in today's grade schools? Quick side note: if Millennials are digital natives, what, then, are my two children, both of whom were born post-iPhone and post-iPad, and look at corded phones that access land-based wired telecommunications as if they were technological fossils of a long-expired civilization?

But back on topic, it seemed almost self-evident that digital natives (assuming those exist, or ever existed) had not only a, uh, micro-perspective that wrapped into their generation's broader perspective, but also assumptions and ways of working that were unique to their group, and perhaps even advantageous. They've grown up with keyboards, massive amounts of time sitting in front of screens usually attached to a computer-like device, absolutely stonking amounts of computer storage and processing capability, even cell phones before the smartphone revolution of the last ten years. The research findings in Nature's article do make sense, however.

Keyboards are just tools, and Millennials possess no inherently better ways of handling I/O on computers and smartphones than anyone from another generation. They let their email inboxes fill up like (most) everyone else. They play the same games and use the same media available to everyone, all of which has gone electronic. They frequent many of the same websites while gravitating away from traditional cable media services. They have their noses stuck in their phones (just like nearly every adult of every generation patronizing the coffee joint I'm typing this in). Their expectations for how life should move along may be different, but that's not so much a "digital native" thing as it is a Millennial thing, and even less so when one considers the general trends across global society, particularly in workplaces enabled by technology.

I have personally come to a place where I am rather oppositional to the "Millennial" label, so I feel I should divest myself of other labels as well, however well-meaning they may be. "Digital native" should probably be dropped because we can now apply data to at least some of its assumptions, and that data falsifies at least some of them. I consider myself a digital native, even though I'm a late Gen Xer. Before I had even completed high school, I was using Windows PCs, and once I entered college in the mid-90s, nearly everything was handled by computer. Twenty-one years later, I do nearly everything not requiring physical effort on a computer or smartphone, and my handwriting has suffered to the point where the old joke about doctors' handwriting applies quite well.

So we as knowledge workers, who may not have grown up with half the things Millennials did but surely use most if not all of them now, should probably drop the labels and focus on solutions for people where they are, not where they "should" be based on the shelf where their bottle - now broken - was expected to sit.

Monday, February 27, 2017

Is eLearning Really on the Decline?

I was reading Karl Kapp's blog post from 1/31 about market reports on the decline of self-paced elearning. I would agree with the comment another reader left on his post: can we really say that self-paced elearning is on the decline? It's a really grey area, and methinks the tools to capture this information aren't honed well enough to make that claim.

...but first, a message from our sponsor...

Let's get one thing straight up front: elearning is not going away, though it may take on a new name at some point. I don't see self-paced elearning declining, either, though where the activity gets recorded will determine whether market data shows a "declining" status. While self-paced elearning may be declining in the enterprise (indeed, far fewer corporations want churn-and-burn, objectives-content-summary-test modules), "elearning" is a broad term. What does seem to be growing is a class of much more produced and designed learning experiences in the form of games with badges, point systems, and the like. But are these not also elearning?

The death, or at least the waning importance, of slideshow-based "courseware" elearning is certainly real, though it still has some basic relevance for quick data dumps, minor topic updates, and reporting purposes. So it is perhaps more apt to say that basic elearning, what would be considered "bad" or undesigned elearning, is declining.

Kapp closed the blog post by highlighting the continuing and likely growing need for Instructional Design. This is also true. Truly deep and impactful learning game experiences cannot exist without thorough design.

True, but...

But to vindicate "bad" elearning a bit, or at least explain why it came about first and remained relevant for so long: it followed the pattern of any new tool humanity finds. In the 1990s, society had this thing called computer technology and the Internet explode into our daily lives, having spent the first part of the decade creeping onto our desks at work and into our dens at home. Once computing power and Internet connectivity were roughly ubiquitous enough for some meaningful democratization, naturally some would ask, "How can this be used in the employ of education or organizational development?"

Like children fueled by curiosity, a lack of experience, and a new tool we'd never seen before, we grabbed something and ran with it, sharp end first. This predictably led to a lot of doing the right thing the wrong way in the early going. Companies have realized, over the last 15 years, that throwing money at these programs could be, and in many cases was, somewhat wasteful, just as they realized between 2000 and 2010 that sending people cross-country for in-person training sessions wasn't panning out well.

What Could This Mean?

Today, the focus is shifting to the individual taking on more responsibility to upskill themselves, both within a job and outside of one. How that is accomplished is up to each person, and that is where I think the data is potentially skewed to represent a greater decline than actually exists.

Monday, December 19, 2016

Startup Pitfalls: Overpromising the MVP

By now you may be aware of the fate of Theranos, the embattled medical testing startup that was the darling of Silicon Valley just over a year ago. If you are not familiar with their story, I advise you to go research it, as it is perhaps the prime example of great intentions and vision in a Silicon Valley startup being sunk by, among other problems, secrecy, lack of execution, and a failure of critical examination.

For those familiar with the startup, the paper-thin foundation upon which Theranos was built began collapsing after a landmark Wall Street Journal report on the company, released near the end of 2015. The report detailed how Theranos had yet to demonstrate its vaunted-but-unseen blood testing tech in research, or any data-based results whatsoever, while continuing to use existing medical equipment to perform all of its marketed tests. In the wake of the WSJ story, Theranos' workforce mysteriously dwindled, unqualified people ran key aspects of its medical operations in violation of federal law, and more and more people began asking questions. Today Theranos appears to be on hiatus and may never recover. Theranos' founder Elizabeth Holmes, until recently Silicon Valley's newest rising star, is now often seen in social media memes touting her quote about intentionally foregoing a backup plan for her company.

After reading this story last week, it seemed to me that one could be forgiven for thinking Magic Leap's name could be used in place of Theranos'. This is another case of hype and early investment exceeding the MVP (minimum viable product)...or any MVP at all.

Magic Leap has been a darling of the AR (augmented reality) market, a startup that has supposedly created tech that can superimpose amazing visuals over what people see using a headset device. Perhaps needless to say in this age of tech startup stardom, Magic Leap has secured funding and global investor interest to the tune of a rough valuation near $5 billion. They made a promise to their market, but have yet to deliver on it. Now, after a bit of digging by the press, it seems Magic Leap is much further behind the curve they were thought to be ahead of.

Rony Abovitz, founder of Magic Leap, is telling everyone to have faith. "Believe," he tweeted earlier this month. But how is anyone interested in his company's technology supposed to accept that kind of furtive solace when it has been revealed that Magic Leap's hype-generating video was not a display of their tech in progress, but essentially a Hollywood-quality ruse showing what their technology might eventually do (with no delivery time frame in sight)?

I'm very interested to see how Magic Leap's story plays out in light of the Theranos debacle. There are many parallels that I see just on the surface. Theranos suffered from good intentions attempting to take a shortcut to financial and economic glory through Silicon Valley hype-sterism, and from a reliance on an executive and advisory team that was not sufficiently circumspect about the plan or demanding of results.

Hopefully Magic Leap's story doesn't include all of these weakening elements, and will take a better path in the near future.

Thursday, September 22, 2016

Changing Focus...

A few years back, I started this blog to discuss educational technologies and interventions for improving human performance. I borrowed the blog's byline from the movie The Matrix, trading spoons for calcium carbonate. My point in selecting this quote was to note the ever-changing nature of what a "solution" is, and that true solutions never really fit in molds or follow conventions.

I want to redirect the meaning of my borrowed metaphor by shifting this blog to focus on the emergence - or rather, re-emergence - of VR, virtual reality, as a viable technology for solving problems. I will also be including augmented reality (AR) and mixed reality (MR), as I envision a future that harnesses all three of these technologies rather than pitting them against each other. But other technologies that will rapidly change how we learn are those that power how the machines around us learn. Machine learning (ML) and data mining (DM) will allow computers to begin to generate VR environments for organizations. The big data (BD) that enterprises have been gathering for a few years now will become the database from which the data mining can be done.
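To make that last idea a bit more concrete, here is a toy sketch in Python. It is purely my own illustration: the incident records, field names, and scene format are invented assumptions, not any product's actual API. It shows, in miniature, how mined enterprise data might parameterize a generated VR training scenario.

# A toy illustration of data mining feeding VR scene generation.
# Hypothetical input: safety-incident records an organization already collects.
from collections import Counter

incidents = [
    {"area": "loading_dock", "type": "slip"},
    {"area": "loading_dock", "type": "forklift"},
    {"area": "warehouse_aisle_3", "type": "slip"},
    {"area": "loading_dock", "type": "slip"},
]

def build_scene_spec(records, top_n=2):
    """Mine the records for the riskiest areas and emit a simple scene spec
    that a VR engine could consume to place practice hazards."""
    by_area = Counter(r["area"] for r in records)
    zones = []
    for area, count in by_area.most_common(top_n):
        hazards = Counter(r["type"] for r in records if r["area"] == area)
        zones.append({
            "location": area,
            "hazards": dict(hazards),
            "difficulty": min(1.0, count / len(records)),  # crude risk weighting
        })
    return {"scenario": "warehouse_safety_refresher", "zones": zones}

print(build_scene_spec(incidents))

The real version would obviously involve far richer data and models, but the pipeline shape - organizational data in, generated learning environment out - is the point.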

The other obvious augmentation to all of these is to bring mobile into the mix, but for now I want to emphasize my shift to a technology focus here. I also want to note that simply using tech isn't enough to serve as an OD/OM/JIT solution in itself. In that spirit, here is a smattering of the VR/MR/AR and DM/ML/BD developments from the last few years that will fuel the next decade of advancements:




Tuesday, February 16, 2016

Project-Based Work Evolution Breeds Project-Based Education

The Chronicle of Higher Ed posted an interview with Christine Ortiz, Dean of Graduate Ed at MIT, in which she discusses her perspective on higher education and how she sees it evolving in the future. Like many of her peers, she believes the concept of tenure is on its way out. What I liked most about her perspective is that she sees education becoming more project-based and on a long-term schedule, with the tools of traditional instruction taking a peripheral seat, accessed when and where necessary. In this sort of model, the lecture is no longer monolithic, and any "lecture" can be as short as 5 minutes. There are plenty of examples already using the short lecture format (Udemy being one).

I like where she's going with this, and I'm already envious of future generations that will have the opportunity to learn in this way. But I see this format changing not only traditional higher education but also other arenas like vo-tech, which is itself already changing with a proliferation of job prep and technical electives at the high school level. And where do the essentials of education, the 3 Rs, factor in? What would a holistic higher education program look like that combines long-term, multi-year projects centered around education with real-world experience as the central modality? The ideal end result of such a system would seem to be a market-ready individual who, using traditional time metrics, has a portfolio of projects and a wealth of experience gained in parallel.

There will certainly still be challenges, though. Project selection and a life cycle view would become paramount concerns early on. It would not be beneficial for the individual for projects to become obsolete a year or two in, or lose value by the end of the education process. The speed at which markets and industry move, and their rate of acceleration, will make this aspect of future higher education very challenging.

Another challenge will be keeping learners (née students) abreast of those technological developments well enough that they don't fall behind.

A third is likely to be a "client-side" feature that higher ed may have to adopt for many fields: the ability to participate virtually but in an in-person fashion. This is technology that is still coming about, but it could prove to be a boon for Ortiz's vision.

Will this model shift higher ed to a year-long format?

Will FYE (first year experience) courses become fundamental inclusions? Will these become part of the late high school experience?

Where will the lines between education and work be drawn?

The future of higher education is daunting but holds increasingly limitless possibilities. I'm very much looking forward to seeing how this particular future plays out.

Thursday, February 4, 2016

Random Thought...

Tip the half empty/half full experiment on its head:
Do you think "necessity is the mother of invention", or that "everything that can be invented, has been"?
Maybe I'm optimistic, but I think most people, when asked, would choose the former over the latter. Our 21st century times practically demand it, almost instinctively.

I hit upon this thought while trying to get my kids off to school this morning. What are your thoughts? Consider if you were asked this in a job interview; what would your response be and how might you elaborate?

Friday, January 8, 2016

Bad Beliefs: Talent

How many of us have either been told we have, or complimented someone else on, a "talent" for accomplishing things in a certain area? Some have heard it more than others, but the popular notion that a person is "born" or predestined to be exceptionally good at one thing over others is one of the most well-known, intangible social memes. You hear it a lot in discussions of sports figures. I've always had a bit of a problem with this concept, because the logical implications for the ostensibly "untalented" are not generally positive.

The Huffington Post had a story yesterday that has links to some of the basic discussion of this phenomenon from a scientific and media perspective. This is a growing area of scientific inquiry, but evidence is already showing that talent, as popularly defined, is nothing more than a myth.

Wait a minute...let's deconstruct the word first:
Talent: "inclination, disposition, will, desire"
Ok, that's etymology. What about the definition?
Talent: "a special ability that allows someone to do something well"
Notice the difference between the word's construction and its accepted meaning. Inclinations and desires can be adjusted; special abilities cannot.

Donald Clark has had a negative view of talent for some time, and I agree with him. Simply put, talent is a negative, limiting belief that negatively affects how we judge the effort and outcomes that others produce. It's simply incompatible with modernity and the reality of organizations that need individuals to do more with the hands already employed.

A decade ago, a group of scientists was already finding evidence of what everyone intuitively knew: that hard work and determination were correlated with success. This is another popular meme, drawn not just from our own individual lives, but from observing the fruits of others' labor in this Age of the Entrepreneur™. The Economist just posted a story about new research by Chia-Jung Tsay of University College London that deconstructs the myth of talent further. Her findings show how ingrained the notion of "talent" is by presenting investors with a group of entrepreneurs divided into "naturals" and "strivers".

In her results, the naturals were almost always seen as the more enticing bet to receive investment dollars, because the belief is that they have to work less hard to achieve great results, regardless of IQ and other factors. The strivers, on the other hand, were given the distant-second nice-try award for effort. The strivers always faced an uphill battle: even with grit and determination, they were perceived as taking longer and requiring more effort to achieve the same results. In essence, the money follows a perceived ability to deliver more with less energy.

I see a further problem: a false dichotomy that has us choose between talented and untalented. But as the evidence of Duckworth et al in the 2005 study shows, even individuals identified as uniquely talented are only very infrequently further identified as a "prodigy" (another term with no quantifiable definition).

What this means is that socially we have put a lot of intangible weight on things that don't actually exist. It's no longer a question of talent versus non-talent. In my opinion, today's enterprises simply cannot afford to look at their people as one or the other, and reward them as such. And the real-world results bear this out: many companies today are looking for ways to get existing employees to extend their experience beyond their job description. Those efforts have mixed results depending on a company's individual outlook on OrgDev, but those efforts are real.

...socially we have put a lot of intangible weight on things that don't actually exist.


And to interject a bit of my own life experience: as a parent, I cannot afford to teach my kids that they are talented only in one area. I want them to experience so many different things, and talent is a vestige of the past that is no longer useful. If it were, how do we explain people like Andre Agassi, who hated tennis yet achieved greatness in that sport? Or Tiger Woods, who hasn't set a golf club down since the age of 4? If sports is a great place to prop up the notion of talent, it's a great place to disassemble that bad belief as well. I want my kids to practice the true meaning of "talent" - the desire and perseverance - not the accepted definition that they are gifted in one thing or another, excusing a lack of effort elsewhere.

So which would you rather have in your organization? 1 talented individual, or 10 persevering ones?

Thursday, January 7, 2016

Learning UX: From Brick-n-Mortar to Virtual, Spitballing the Future of Virtual Desktops

Catherine Lombardozzie, writing for ATD's online magazine, expounds on the characteristics of a true learning environment in an organization. She's essentially making a case for talent development professionals facilitating an environment where learning can happen at whatever level of formality and pace each individual deems necessary. Her points are all sound, and I think they serve as a reminder for those providing training products and services from the perspective of an external vendor.


My role is essentially to ensure most of Lombardozzie's points are represented in the environments I help deliver. For the past several years now, I have been interfacing with clients in efforts to build experiential learning environments, primarily in the form of virtual desktops. Until recently, providing a desktop with installed software for a specific course, either general or specialized, has been mostly sufficient, and the technical features completed the experience. But I've seen that really begin to change in the last year.

The time has come that I must now think about virtual desktop deployment in a holistic sense: programs over courses, unique experience over simple service delivery. Customers want something beyond a desktop with software, because the increasing proliferation of virtual machines and the software that runs them has made the delivery of such a service more commonplace. To take some of her points about meeting the basics of the experience: what was cool in the past is not that interesting anymore, because the market is familiar with it now and wants something more.

Gamification is often a road that is considered or requested, but given the current state of the desktop virtualization market, there are few, if any, integrated tools that can allow this kind of functionality and experience in a cohesive, cost-effective way. Such discussions typically turn to how best to construct a more "game-like" environment in a given course by manual means, with the grading structure as the basis. A hard sell for major higher education institutions, to say the least.
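As a concrete, entirely hypothetical example of that manual approach, here is a minimal Python sketch that layers points and badges over an ordinary grading structure; the item names, weights, and badge thresholds are my own invented assumptions, not anything from a real course or tool.

# "Manual" gamification built on the grading structure:
# convert graded items into points and award the highest badge earned.
grade_items = [
    {"name": "Lab 1: Provision a VM", "score": 0.9, "weight": 100},
    {"name": "Lab 2: Configure networking", "score": 0.8, "weight": 150},
    {"name": "Quiz: Storage basics", "score": 1.0, "weight": 50},
]

badge_thresholds = [(250, "Builder"), (150, "Apprentice"), (50, "Newcomer")]

def gamify(items, thresholds):
    """Translate weighted scores into points and the highest earned badge."""
    points = sum(round(i["score"] * i["weight"]) for i in items)
    badge = next((name for cutoff, name in sorted(thresholds, reverse=True)
                  if points >= cutoff), None)
    return {"points": points, "badge": badge}

print(gamify(grade_items, badge_thresholds))  # -> {'points': 260, 'badge': 'Builder'}

It works, but everything here lives outside the virtual desktop itself, which is exactly the integration gap described above.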

So what can be an ideal user experience in virtual desktops for training?

My sense is that the user experience can incorporate most of Lombardozzie's points as regards resources and curation. After all, it is becoming increasingly likely that, just as society moved from single-car households to more than one, so too will our computing lives move from a single computer to many. We see this already in the power contained in our smartphones, and how cloud services link them to our only somewhat more powerful laptops and desktops. It's feasible that people will soon carry virtual desktop files or stick computers that are intended for specific uses such as education. Once that happens, it's further possible, I think, that services will be built and offered around supporting such products delivered as a cost item or service, and around curating them for users.

Do you work in or with virtual desktops? What do you see for the future?

Thursday, September 3, 2015

The Brass Tacks of Individual Development

Michele Martin on Friday tweeted a link to a Fast Company article about an SAT test prep platform that helps students succeed while predicting how successful they will be. Michele's comment that there could be interesting implications for adult learning and motivation is probably an understatement.

Michele had earlier written a blog post about how the very notion of a job is changing, and how people can generate multiple streams of income by thinking from an entrepreneurial perspective. She and many others are right: with the relative democratization of options for gathering knowledge and experience, so too must people's education and "job" prospects be diversified like a stock portfolio.

Speaking with respect to SAT test prep, the problem seems to be self-motivation and that all-important work factor. From the Fast Company article: "The problem...[is] that kids don't do the work because there's little immediate, incremental incentive, and it's piled on top of an already full load of classwork and activities..." One indicator of this, to my mind, is the proliferation of first-year success programs in higher education. Many of them focus on educational basics I was required to have to enter college in the mid-1990s. Many people entering community colleges, public universities, and for-profit institutions come from a less fortunate educational background but still seek higher education. However, many of them don't know how to cope with a model that essentially inundates them with choices while they lack the glue of motivation. What's left is a hybrid of a prescriptive learning model intertwined with the feel of completely open education.

Some will have the motivation to push through the early period to success. Others will lack that, and the risk is that they will feel forced to stop or put off their education. Testive, I think, hits on a workable model for motivation. Such methods were very helpful in my recent success in passing the PMP exam. The possibility of passing that test would have been remote for me if I did not have practice facilities that mimicked the real thing while providing immediate feedback and a focus on weak areas of knowledge. While Testive's platform focuses on SAT and ACT testing, I think their approach could be applied to all sorts of areas.

So how does a test success algorithm relate to career entrepreneurship at the individual level?

With the democratization of educational options, it's becoming easier for someone to steer their career in new and different directions, or to build on the current course, on a proverbial dime. Tools similar to what Testive offers can help people achieve goals more quickly by predicting success through the targeting of weak areas.
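To illustrate the core mechanic, here is a minimal Python sketch in the spirit of (not based on) such tools; the topic names and practice history are hypothetical. It simply scores past attempts by topic and serves up the weakest area next, which is the essence of targeting weak areas.

# Pick the next practice topic by finding the learner's weakest area.
# Topic names and attempt history are invented for illustration.
from collections import defaultdict

history = [
    ("algebra", True), ("algebra", False), ("geometry", True),
    ("reading", False), ("reading", False), ("geometry", True),
]

def weakest_topic(attempts):
    """Return the topic with the lowest observed accuracy."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for topic, was_correct in attempts:
        total[topic] += 1
        correct[topic] += int(was_correct)
    return min(total, key=lambda t: correct[t] / total[t])

print(weakest_topic(history))  # -> 'reading' in this toy data

Layer a prediction of likely test performance on top of that loop and you have, in rough outline, the kind of feedback system that kept my own PMP practice on track.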

My overall perspective is that more and more tools are becoming available that will drive solutions letting people guide themselves through the entire education and career launch process. We may eventually reach the point where people onboard themselves into a company by testing against and passing a particular performance standard for a particular role. There are all sorts of positive and negative implications here, once society has determined how best to resolve the motivation problem. Society already seems distrustful of a future where advancement in anything depends on cold, quantitative analysis of everything. The positive side affords freedoms our forebears couldn't conceive of, though that may look like a precipice to some.

In all honesty I have some genuine trepidation about these developments as well. The future will be interesting one way or another.

Tuesday, August 25, 2015

Self Awareness and the Performance Review

Harold Jarche's weekend post relates a story from Harvard Business Review. It's about a sort of star candidate who was groomed to lead an overseas division of a company. All the boxes had been set up for him to check off, but as his leadership began and things progressed early on, outcomes didn't go to plan. The status quo was followed too closely, and increasingly the operation of the business and the division suffered. How could this be?

So someone else - more or less the diametric opposite, as far as the details of the story let on - was put in place to lead the overseas division. Success resulted soon after. The moral of this particular story is that the status quo and the leaders working within it may not generate the expected level of success. But this wasn't what I took from the story.

My takeaway is that the story underlines the importance of self-awareness, and of using external feedback to gain that perspective. John from the story appears to have been someone who felt the system would handle whatever difficulties came up. He delegated his leadership role to the very system he was supposed to be guiding. His replacement, Alex, took an active role. John didn't bother learning; Alex did. But beyond that, it doesn't appear that John did much work in the way of checking his performance with his superiors. Alex, on the other hand, bucked the status quo while making begrudging admirers out of them.

Something I've done at various points in the positions I've held is to initiate off-the-cuff performance reviews. The reason I've done this is to address any concerns I may not be aware of before someone else prepares to address them with me. I've also felt that it shows some initiative and an effort to maintain a learning-and-improving ethic in my profession.

What about you? Do you initiate informal reviews of your performance? What ways have you sought out feedback not directly related to a training or major process change event?