A shout-out to Cambridge Savings Bank for great customer service

It sucks to write checks for the IRS. But then again… I’m happy to contribute my share for the greater good. No big deal. A bigger deal would be losing money because of oversight.

So the other day, I had to write a check for what I owed the government. I filled it out, mailed it – out of sight, out of mind. But then, as things go, I remembered it while driving through beautiful Somerville, MA. I had checked my account balance that morning, and my brain suddenly put two and two together: I didn’t have the balance for the check to go through!

Luckily, my bank, local Cambridge Savings Bank, offers text banking! So as I got to a red light… (ha! Thought you had me there, didn’t ya) I reached for the phone that, of course, already sat on my dashboard, and switched to the messaging app. I texted the words “transfer 2000” to the bank. A few seconds (!) later, they confirmed that I had successfully transferred $2000 from my savings to my checking account.
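Under the hood, a text-banking backend is essentially a tiny command parser over SMS. Here is a minimal sketch of what that might look like – the command names and the savings-to-checking default are my assumptions for illustration, not Cambridge Savings Bank’s actual protocol:

```python
import re

# Hypothetical sketch of a text-banking command parser. The command
# vocabulary below is an illustrative assumption, not the bank's real one.
COMMANDS = {
    "bal":      "report account balances",
    "transfer": "move money from savings to checking",
}

def parse_sms(text):
    """Return (command, amount) or raise ValueError for unknown input."""
    match = re.fullmatch(r"\s*(\w+)(?:\s+(\d+(?:\.\d{1,2})?))?\s*", text.lower())
    if not match or match.group(1) not in COMMANDS:
        raise ValueError("unrecognized command")
    command, amount = match.group(1), match.group(2)
    return command, float(amount) if amount else None

print(parse_sms("transfer 2000"))  # ('transfer', 2000.0)
```

The appeal of the channel is exactly this simplicity: a constrained vocabulary (a verb plus an amount) is trivial to parse reliably, which is why the confirmation can come back within seconds.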

What had happened? Within a mere 15 or 20 seconds, I went from passing thought to resolution. And all because my bank got four of the following “now consumer” (me!) expectations right:

They “let me do it”. They “made it mobile”. They “fit into my life”. They “saved me time”.

Here’s another example of why I already consider myself a loyal customer. A few months ago I asked them whether they supported Apple Pay with the MasterCard I have. I asked on Twitter, a convenient channel for simple customer service inquiries. They got back to me. Not in record time, but that wasn’t needed for this type of inquiry. They told me they didn’t support Apple Pay yet, but would tell me once they did – they were working on it. Did I honestly believe they would remember telling me? Not really. However, a few weeks ago, I got a tweet out of the blue: Cambridge Savings Bank informed me that they now support Apple Pay, adding a link to more information. They did remember!

They “know” me. They “made me smarter”.

Finally, the person who set up my account last year sends me frequent email updates – proactively, without my even asking – on whether the great interest rate that made me a customer in the first place would continue beyond the promotional timeframe. (It does.)

OK, one more. Just a few days ago, I needed to deposit a check for a higher amount than the deposit function in their mobile app allowed. I wasn’t in the mood to go into the branch, so I asked via email if they could help me somehow. Within an hour or two, the mobile deposit limit was raised temporarily. I could deposit the check the same day.

They “made it easy”. 

Here’s a shout out to you, Cambridge Savings Bank. Thanks for providing great customer service. Keep doing what you’re doing. (Oh and if this post brings you a new customer or two, why don’t you keep up that interest rate on my money market account for a bit longer… ;-))

Sincerely,
A loyal customer.

Digital Touch: The Hidden Gem inside Apple Watch

I think Apple brought the “cool” back. Here is why I think the Apple Watch has the potential to be the company’s biggest success yet.

Social communication is everything. It’s the essence of mankind. No other “man-made” system is more complex than human language. Having studied computational linguistics and phonetics at Bonn University, I got a glimpse of the intricacies that lie in the combination of sounds, gestures, and facial expressions that make up this system. The sound system alone is more than just the ABCs – the magic lies in what linguists call the suprasegmentals, such as melody, intonation, tone, stress, pitch, even volume.

When we added a system of written language to spoken communication a few thousand years BC, we did just enough to carry the most basic suprasegmental traits over with punctuation and diacritics. Since written language was not primarily meant to represent everyday dialog but rather thoughts, facts, and argumentation, that limited set of suprasegmental markers was just enough to keep misunderstandings at bay – misunderstandings that a richer system of cues would avoid almost completely.

When the Internet brought forms of communication quicker than the basic letter, such as email a few decades ago, we started realizing the need to represent nuances – a wink, sadness – that we could easily convey with tone or timbre in spoken communication. We invented smileys and emoticons. They let us represent irony or other basic sentiment, which before we could only do through more words.

Soon after, communication got even faster with the introduction of instant messaging, or “chat”. Suddenly, we found ourselves representing gestures or facial expressions with acronyms such as LOL or, quite recently, SMH (shaking my head). We also went from surrounding a gestural expression with asterisks (*sigh*) to the infamous “hashtag” – thanks to a few folks who came up with the idea of a “micro blog” in 2006 (technically speaking, it wasn’t the founders of Twitter who came up with the hashtag, but the users). I am tempted to #SMH at that, but as a linguist by education I just see (written) language evolving again, as it has always done. There is also no need to despair over the alleged degradation of our intricate system of language – if anything, the system is only growing in complexity, never shrinking.

Very recently, a new form of communication was again “invented” with the creation of the “Yo” app. Or was it? The app can do one thing, and one thing only: send someone a “yo”. It attracted venture capital of $1.5m. Now you might throw another #SMH at that at first sight, but think about it. When is the last time you sent a simple nod of your head someone’s way? Probably today. Since context is key in understanding human language – knowing what a conversation is about and what has already been said (linguists call the study of context pragmatics) – a nod can be all that’s needed in a context that both communicating parties share. Yo saw over a million downloads within days. People love these simple forms of communication! Sometimes it can’t be simple enough.

In parts of India, Africa, and other areas of the so-called third world, people have agreed on ringing patterns to communicate, the so-called “flashing” or “beeping”. Rather than calling a phone with the intention to talk, they let the other party’s phone ring, having agreed on patterns beforehand. Ringing once might mean yes, twice might mean no, thrice “I’m downstairs, come out” etc. (I think I’ve used the latter meaning myself in the past…) As long as the context is known, that might be all you need to convey sometimes. And guess what, it’s free! Ringing does not incur a charge. Something the carriers in those markets do not love at all.

Now let’s get to why I’m telling you all this. I am because I watched Apple present their new communication paradigm with the recent introduction of Apple Watch and realized that they have a gem there that could explode big time and be the reason for the Watch’s success. Apple has come up with a number of new ways to communicate, which they call Digital Touch. Let’s start with the “coolest” one.

The Heartbeat
Since the Watch can measure your heartbeat thanks to its built-in sensors, you can literally send your current heartbeat onto somebody else’s wrist. The Watch can not only vibrate one way; it has an elaborate haptic system that can generate tangible sensations of different durations, in different areas, at different intensities. What on Earth would I use that for, I hear you ask. Maybe to share my heartbeat with my girlfriend. Maybe to share it with a friend after a run (“hey, here’s my pulse, not bad after 5 miles, right?”), or while watching a horror movie (“oh man, this flick is intense, check out my heartbeat”), or riding a roller-coaster. Or to communicate boredom to a presenter, or calm to my mom before an exam. Or… you’re next with ideas.

The Sketch
The Watch lets me draw on screen and then re-draws that pattern, following my exact movements, on the recipient’s screen – an effect that recently made the game Blek successful and addictive. What I draw gets re-drawn and then disappears – something that made Snapchat famous and worth $10 billion. Yes, billion. I can draw a quick check mark to send a “yes” or a “got it” to a friend. Or a house to tell dad I’m home. Or a question mark to tell my colleague I have no idea what our boss just meant with that remark on the phone call we’re both on. Or a heart to tell my girlfriend that I’m thinking of her…

The Tap
I can touch the screen in different places. The touches will be shown as “drops” appearing and vanishing on the recipient’s wrist in the same rhythm in which I generated them. I can imagine teenagers coming up with an elaborate “language” of touch patterns that only they can decipher. We will witness the birth of micro-languages that small groups agree on and use for communication – something I loved doing with friends when I was a kid. This is just a modern-day version of the same.

Walkie-Talkie
We all loved doing this as kids, and guess what: the walkie-talkie is seeing a renaissance with apps such as WhatsApp that have offered it for a while. You tap to record a snippet of your voice or your surroundings, then let go to send. As simple as that. Apple added this feature rather late, in their recent iOS 8 release, but well, they added it. And it completes the Digital Touch framework.

 

Honestly, as a linguist, I can’t wait for the release of Apple Watch to see all this come to life. As a consumer in love with Apple’s perfection in everything they do (tolerating the occasional disappointments they cause), I want to get my hands on one as soon as I can, hoping that enough others will join this exciting world of communication, too. And Apple will probably take some of these features over to iOS for phones and tablets: there’s no reason why I couldn’t tap or draw on my iPhone and send these ephemeral communication primitives to a receiving phone.

Apple clearly brought the “cool” back with the Watch announcement and all the things they invented around this “most personal device yet”, such as the pressure-sensitive screen, the “digital crown”, and the numerous sensors onboard. And concocted yet another device I didn’t know I needed. Someone who can do that on a repeated basis deserves nothing but the utmost respect. Don’t you think?

How UPS doesn’t Grok Twitter Support, and 5 Lessons for Doing it Right

I have to share this memorable customer experience I “enjoyed” today for your reading pleasure. It started a few days back, when I tried to change the delivery of a package I was awaiting from Apple.

I knew I wouldn’t be home in person for a signature on the expected delivery date, nor did I want to pre-sign, so I opted for pick-up at a UPS facility. On the UPS website I found out I had to register for a (free) account to make a change like that. Fine. I tried to register, but the website kept going in a loop: I registered, tried to change the delivery option, was told to register to do that, registered again, tried to change the delivery option again, was told to register again… you get the idea. A bug. I had no choice but to contact an agent (which is far more expensive for UPS than my using their website). I opted for Web chat. It worked fast and smoothly. I had the agent change my delivery to a drop at a pick-up location. So far so good…

Today, I got an SMS alert from Apple (love their service) telling me that “today is the day”. I clicked on Track Shipment to find out where to pick up the package – i.e., which UPS facility. The tracking website told me everything about the journey of my package: how it started in China, then went to Korea, then Alaska, Kentucky, and finally Massachusetts. What it failed to tell me? Where to pick the package up. All it said was “A pickup facility in Somerville, MA”. Thanks UPS, that’s helpful. Google tells me there are many UPS locations in Somerville. Which one?

Lesson 1: Fix your data.
It needs to be self-explanatory, not force the customer to make unnecessary inbound service requests.

 

I turned to Twitter, asking @UPSHelp for help. Here’s how that journey started:

Uh, yeah… There is. Kinda obvious, no?

Really? I’m contacting you on Twitter and you’re sending me to email? There’s your first mistake, @UPSHelp. I picked Twitter for a reason (simple: I like it – it’s convenient and fast for simple inquiries like this), and you would be perfectly capable of answering my question on this channel.

Lesson 2: Don’t force your customers to switch channels unless they ask for it.

 

Great. Looks like you got it now. So here is my DM:

Wow. “the local center”. Really, UPS? You do know the very reason why I contacted you, don’t you? Also, did you spot the second mistake? They responded on the public channel, rather than staying on the DM channel we had just established through mutual following.

Oh, and – pickup times of 3 hours and only on weekdays? That’s almost disrespectful.

Lesson 3: Once on DM, stay on DM.
I didn’t choose Twitter to make my request public – I chose the channel for its convenience, speed, and simplicity.

 

I’m still communicating on DM, but their response again happens on the public channel:

I then realized that the pickup times were actually really bad, and I really wanted this package on a Saturday.

Wow. You just completely blew my mind. 20 minutes in, and you’ve already completely forgotten our conversation? That’s unbelievably silly. I don’t care if there was a change of shifts (note that the agent apparently changed from “SB” to “SO”). It’s just unacceptable to be treated like this. I’m starting to lose patience. Also: again the request to switch to email, even though they have clearly demonstrated they could answer my questions on Twitter.

Lesson 4: Never lose context, never force your customers to repeat themselves.

 

I responded right away, assuming I had their attention. That was naive of me, of course. I waited 30 minutes for a response and became impatient.

That may very well be, but what about other facilities nearby? Can the package not be transferred to a facility with better opening hours (assuming those exist in the first place – I don’t know, you tell me)? Please try to solve my problem, not just answer questions.

Lesson 5: Think and help, proactively – don’t just answer questions.

 

That was my last interaction. I am still waiting for a response, hours later.

Sorry @UPSHelp, but you have a lot to learn (and fix) if you want to get this customer service thing right. Need help doing it? Talk to my company – we are in the business of fixing bad customer service. Meanwhile, you have one more unhappy customer, who happens to be a customer service professional who likes to blog.

 

Addendum: I did email them, as they had asked me to, and they responded (6 hours later). In that email response, they told me that the location actually opens in the morning AND in the afternoon. So the info in their tweet was actually wrong! I’m still sitting here, shaking my head.

The Dawn of the Era of “Human-assisted Machine Service” in Customer Care

For the longest time, we have looked at how to complement human labor in the contact center with computer programs – to reduce costs, provide 24×7 service, and offer quicker access to basic information. IVR systems are the prime example of technology that complements human service by pre-qualifying a caller through a few simple questions and routing them to the right agent. What we found until recently is that customers usually preferred the “human touch” over automation. The terms “automation,” “bots,” and “IVR” tend to carry heavily negative connotations. TV ads have mocked bad IVR systems and bragged about short wait times to reach live operators. Websites such as gethuman.com have been launched to show consumers how to reach an operator fastest.

As another example of how technology is complementing human labor, let’s look at the agent workplace itself. In the contact center, agents are equipped with Internet access, dedicated knowledge bases, document management systems, CRM systems, and more, to have the answer to the customer’s question ready as quickly as possible.

All of this, however, is slowly changing. Rather than having machines assist humans, we’re slowly entering the era of the reverse: algorithm-based customer service, assisted by humans who put the “finishing touches” on an otherwise increasingly impeccable experience. Virtual assistants on websites have recently been gaining popularity as they expose a more natural interface (conversational, “spoken language” style) to finding information vs. manual search. The younger demographic prefers texting to calling. Older generations seem to be catching up and agreeing. We now carefully send a text first to check if it is OK to call. What a remarkable change in behavior and expectations!

Why is this change happening?

First and foremost: the mobility revolution. The introduction of the iPhone in 2007 truly brought the mass consumer market to realize the power of computer technology when applied to daily life. For the first time, the Internet – an invention that in its current form came into being around 1989 – found its way into a pocket-sized device without compromising the user experience. More and more people realized that with a mobile device of this form factor, they had the world of information and communication literally at their fingertips. It suddenly became cool to be a nerd – a person who understands and can program computers. Consumers now seem to shout “give me an app, I can look this up quicker than your agents!” at any company they do business with.

Other enhancements came with the iPhone and Apple’s insistence on quality user experiences. Siri, Apple’s speech assistant, is only possible because general-purpose data connectivity and bandwidth allowed overcoming the restrictions of the PSTN (public switched telephone network) in terms of the sound frequency ranges it transmits. Sounds such as “s”, “f”, or “th” differ in frequency ranges that are simply cut off in normal telephony (above roughly 3.4 kHz). Siri can transmit the full range of an utterance recording in high definition, which improves speech recognition accuracy. Cloud computing does its share to respond quickly with a transcription of what was said, to which Siri then applies a semantic analysis to truly “understand” the user – at least to the extent that it can perform the operation the user asked for.
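To make the band-limiting point concrete, here is a small illustration (synthetic data, nothing to do with Siri’s actual pipeline): a noise burst is filtered to the classic telephone passband of roughly 300–3400 Hz, and we measure how much of the energy above 4 kHz – where fricatives like “s” and “f” carry their cues – survives:

```python
import numpy as np

# Illustrative only: band-limit a noise burst the way narrowband telephony
# does (roughly 300-3400 Hz) and measure how much high-frequency energy,
# where fricative cues live, is lost. All numbers are synthetic.
rng = np.random.default_rng(0)
fs = 16000                        # "HD voice"-style sampling rate, in Hz
signal = rng.standard_normal(fs)  # one second of white noise

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

telephone = spectrum.copy()
telephone[(freqs < 300) | (freqs > 3400)] = 0  # PSTN-style band-pass
band_limited = np.fft.irfft(telephone)

def energy_above(x, cutoff_hz):
    """Fraction of spectral energy above cutoff_hz."""
    s = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), d=1 / fs)
    return s[f > cutoff_hz].sum() / s.sum()

print(f"energy above 4 kHz, full band: {energy_above(signal, 4000):.2f}")
print(f"energy above 4 kHz, telephone: {energy_above(band_limited, 4000):.2f}")
```

Roughly half the noise energy of the wideband signal sits above 4 kHz; after the telephone-style filter, essentially none of it remains – which is exactly the information a narrowband recognizer never gets to see.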

Big Data and today’s capability to automatically mine large quantities of data contribute to making systems such as Siri ever more accurate. The formula is simple: the more context you have, the better you can understand someone. While this is most true for pragmatic context (i.e., dialog history and “world” knowledge), it matters even in the phonetic domain alone. As an example: try to understand a few words of a spoken conversation by hearing only a second or so of what was said. Without any context, you will realize it’s not that easy. However, if you know the domain and topic of the conversation and heard the immediately neighboring words, your brain uses deduction and other mental techniques in addition to pure “hearing” to understand an utterance.
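The same idea can be shown with a toy rescoring example: two acoustically similar hypotheses are ranked using bigram counts from a tiny made-up corpus plus a topic bonus – a drastically simplified stand-in for the language models real recognizers use:

```python
from collections import Counter

# Toy illustration of "more context, better recognition". The corpus and
# scoring scheme are invented for this sketch, not a real language model.
corpus = ("it is easy to recognize speech "
          "we recognize speech with context "
          "to wreck a nice beach you need a storm").split()
bigrams = Counter(zip(corpus, corpus[1:]))

def score(sentence, topic_words):
    """Bigram hits plus a bonus for words matching the known topic."""
    words = sentence.split()
    context_bonus = sum(w in topic_words for w in words)
    bigram_hits = sum(bigrams[(a, b)] for a, b in zip(words, words[1:]))
    return bigram_hits + context_bonus

hypotheses = ["recognize speech", "wreck a nice beach"]
topic = {"speech", "context", "recognize"}  # what the conversation is about
best = max(hypotheses, key=lambda h: score(h, topic))
print(best)  # recognize speech
```

Without the corpus and topic context, the two hypotheses are indistinguishable; with them, the recognizer can do the same kind of deduction our brains do.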

What does all of this have to do with customer service?

Recently, companies and technologies have emerged that invert the paradigm of machines helping humans become better. There are companies in the field of IVR technology that run call centers of people who do nothing but listen to what callers tell IVR systems. They never talk to the callers directly; they merely jump in when automated speech recognition cannot tell what a caller said, or isn’t confident it understood the caller properly (speech recognition engines always associate a likelihood with a recognition hypothesis). They then either quickly type in what they heard so that the IVR system can move forward, or click on predefined data items (the so-called “semantic interpretation” of a verbatim utterance) expected in the current dialog step. This is a case of “human-assisted machine service” in the field of customer service that is an amazing testament to the change taking place.
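The routing logic behind such a setup can be sketched in a few lines – the threshold and the example data are illustrative assumptions, not any vendor’s actual values:

```python
# Sketch of the human-assisted IVR loop: the recognizer returns a
# hypothesis with a confidence score; below a threshold, the utterance
# is routed to a silent human transcriber instead of being accepted.
CONFIDENCE_THRESHOLD = 0.80  # illustrative assumption

def handle_utterance(hypothesis, confidence, human_transcribe):
    """Return the text the IVR should act on."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return hypothesis          # machine is sure enough, proceed
    return human_transcribe()      # human assist, invisible to the caller

# Simulated examples:
print(handle_utterance("pay my bill", 0.93, lambda: "?"))            # machine path
print(handle_utterance("p-- m- bill", 0.41, lambda: "pay my bill"))  # human path
```

The caller never notices which path was taken – which is the whole point: the machine serves, and the human quietly backstops it.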

After great success on TV’s “Jeopardy,” IBM released Watson to developers to build applications that use Watson’s unique cognitive capabilities in creative new ways. A prime use case for Watson, however, is customer service. When done right, Watson can engage with customers, say through chat on a website, as if the customer were talking to a live person. Watson doesn’t merely bring up Web pages that seem to have the information you are asking for – it answers your question. Something that Google increasingly does on its core search product, too (try searching for “who wrote Norwegian Wood,” and Google will answer your question – in addition to showing you relevant websites). Watson goes beyond Google, though, in that it can ask back to narrow down your question and lead you to the right answer. It can deduce. It can learn. Like a child absorbing everything, or a very astute student. Most importantly: Watson learns from unstructured data, i.e. data expressed in human language such as English. That’s a new level of computing, beyond plain big data analysis.
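That “ask back to narrow down” behavior can be illustrated with a toy dialog step (my own sketch, not IBM’s API): when one question matches several candidate answers, the system asks a clarifying question instead of guessing:

```python
# Toy disambiguation sketch. The FAQ entries and the (topic, channel)
# keying are invented for illustration only.
FAQ = {
    ("reset password", "email"):  "Use the 'Forgot password' link on the login page.",
    ("reset password", "mobile"): "In the app, go to Settings > Account > Reset password.",
}

def answer(question, channel=None):
    """Answer directly if unambiguous, otherwise ask a clarifying question."""
    candidates = [(k, v) for k, v in FAQ.items() if k[0] in question]
    if not candidates:
        return "Sorry, I don't know that one."
    if channel is None and len(candidates) > 1:
        options = ", ".join(k[1] for k, _ in candidates)
        return f"Do you mean on {options}?"   # ask back to disambiguate
    for (topic, ch), ans in candidates:
        if ch == channel:
            return ans
    return candidates[0][1]

print(answer("how do I reset password"))           # asks back: email or mobile?
print(answer("how do I reset password", "email"))  # direct answer
```

A real system would rank candidates statistically rather than by substring lookup, but the conversational shape – answer when confident, ask when not – is the same.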

With Watson, humans again take a step back from the spotlight, and operate “behind the scenes.” They need to feed Watson with information, constantly. Watson doesn’t go out by itself to learn. Watson needs to be fed product brochures, manuals, data sheets, research papers, books, etc. Anything that is relevant for the domain of knowledge Watson is operating in.

This is the emerging new role of humans in customer service: make sure the data is accurate, but let machines do the “talking” and “serving.” Humans then also step in when that “human touch” is really needed: not to answer the simple questions, but to mediate in complex situations, to calm down angry customers, to provide a level of confidence and confidentiality when needed, e.g. in the domain of financial advising.

It’s going to be exciting to see what the limits are.

Natural Language Processing is Entering the Business World

My company uses Concur for travel management, and I am quite happy with their tools. One feature immediately caught my interest: their use of natural language processing/understanding (NLP/NLU). Ever since Apple introduced Siri to the mass market, speech assistants have gained public interest. Microsoft came out with Cortana recently. Google has Google Now, even though I wouldn’t necessarily qualify it as an assistant with the typical characteristics (dialog-based, traits of a character, etc.) at this point. However, Siri does not have a text-based interface, and I do occasionally miss that feature.

While I can do a quick restaurant or movie ticket reservation by saying “two tickets for Transformers at AMC tonight”, which fills the fields of the virtual “form” automatically for me (movie title, number of tickets, location, time), I cannot type the same. Would I want to? Heck yes! Natural language is such a natural interface for us humans, much more natural than filling in forms with isolated pieces of information. And well, sometimes I just can’t speak, unless I want to draw everyone’s attention to me in that boring meeting I’m attending…

Concur gives me the following dreaded form that we have all seen, in which a number of fields are typically irrelevant for me:

What you might have overlooked is the text field at the top. That one is a natural language inquiry field. Check this out:

With one simple sentence I gave it all the data it needs for my booking, and it is smart enough to assign the values to the respective fields for me. One further click on a button and I can review my flight options. Quite honestly, I find this a much more convenient UX than the good old form-filling approach.
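For the curious, slot filling of this kind can be approximated crudely with patterns – here is a toy sketch; the field names and regular expressions are my guesses for illustration, certainly not Concur’s actual implementation (real NLU systems use statistical parsers, not regexes):

```python
import re

def parse_booking(text):
    """Fill hypothetical form fields from a booking sentence."""
    slots = {}
    # Origin and destination: "from X to Y", stopping before a date phrase.
    m = re.search(r"from ([A-Za-z ]+?) to ([A-Za-z ]+?)(?= on | tomorrow|$)", text)
    if m:
        slots["origin"], slots["destination"] = m.group(1), m.group(2)
    # Date: the word after a standalone "on".
    m = re.search(r"\bon (\w+)", text)
    if m:
        slots["date"] = m.group(1)
    # Rough time of day.
    m = re.search(r"\b(morning|afternoon|evening)\b", text)
    if m:
        slots["time_of_day"] = m.group(1)
    return slots

print(parse_booking("flight from Boston to San Francisco on Monday morning"))
```

Even this crude version shows why the UX wins: one sentence populates four form fields at once, and the user only has to review, not type.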

If you’re a user of Concur, do give it a try! It works just as well with hotel or car reservations. Natural language understanding, speech- OR text-based, is the next big UI, and it’s time to start exploring its use cases. (By the way, it’s great to see businesses deploy consumer-style technology. Not too long ago, it used to happen the other way around: innovation came from the business world and was introduced to consumers later…)

Why the iWatch could truly be “the next big thing”

Apple’s first wearable device will likely feature health tracking functions that will advance this technology and push adoption, much like what the company accomplished for speech recognition through Siri. While the rumors are somewhat contradictory, the device purportedly called iWatch MIGHT have an opto-electronic sensor to measure blood glucose levels continuously. Now if that is true, then I have no doubt that Apple will stay the world’s most valuable company for quite some time.

Tens of millions of Americans have diabetes today, and the numbers are growing. It doesn’t look much better across the globe. If I could see my glucose levels on my wrist constantly (right next to the time and my next appointment), without needing to draw blood every time I want to measure them (as diabetics do today), that would increase my chances of a longer life substantially. I would learn the impact of certain foods on my glucose levels much quicker than I realistically can today, and accordingly I could adjust my insulin dose much more accurately. A quick look at my watch 2 hours after lunch would tell me how those carbs I ate kicked up my glucose levels. At the end of the day, we’re all human, and humans are lazy. Yes, one can measure glucose levels frequently today, but it takes discipline to do it to the extent one should.

It would not only benefit diabetics, though. Seeing the impact of that bag of chips or glass of Coke on your blood glucose levels, blood oxygen levels, and blood pressure, just by looking at your watch, would help ANYONE understand the digestive system better and change the self-destructive habits the food industry has taught us over the decades. The impact of that empowerment on our overall health and the health insurance system as a whole would be tremendous. It could save the public healthcare sector billions. The estimated total economic cost of diagnosed diabetes in 2012 was $245 billion, a 41% increase from 2007.

I don’t know if those rumors about the Apple wearable are true. But I understand why Apple is going in that direction. The move could, once again, disrupt an entire industry and impact our lives the way the introduction of the iPhone, the first “consumer smartphone”, did in 2007. Time to stock up on Apple stock, folks!

6 things I will never understand about the US of A

While I’ve slowly come to appreciate certain peculiarities of American lifestyle and culture, there are still a number of things I fail to comprehend and adapt to. Probably for life. Here’s my top 6 – in no particular order, really…

1) Americans queuing for Starbucks. And Starbucks only.
There is a queue of maybe 12 coffee cravers in front of the coffee shop with the ubiquitous green logo, but nearly nobody waiting in line right across the street, even though the competitor serves… Starbucks coffee. Can you really be so stuck in your ways as to prefer waiting in line?

2) Parking spots wasting space
Despite all the technological advancements that mostly come to life in this very country, curbside parking is still regulated by posts with coin slots (or, the more “advanced” version: posts with credit card slots), one for two spots, with spots so large that two of them could easily fit three vehicles. But no, you can’t make use of that space, as the posts will blink madly at you if you dare try… What a waste of precious parking space!

3) Plastic bags so weak that you need two for an average load
I think one can still claim that Americans by and large don’t care too much about the environment. Take supermarkets. The plastic bags used for packing are typically so weak that the staff has to double-bag everything! Like, always! Even when just putting in eggs and bread. Talk about waste…

4) Flight attendants showing you how to buckle up
Granted, this is not only an American thing, but… Really? Do you really have to show how to use a seat belt? I’m tempted to approach a flight attendant next time I fly and ask how that buckling-up thing worked again – right after they’ve shown everyone. I really am tempted.

5) Rooms cooled down so much…
…that people start putting heaters under their desks in offices. This is one I came to experience shortly after I started traveling the US. There almost seems to be an inverse relationship between outside and inside temperature. Rather than warming it up inside as it gets hotter outside (at a lower overall level, obviously), Americans love to cool their offices down more and more the hotter it gets – as if you should be naked outside and wear a winter coat inside. I mean, think about it. That’s like a movie theater turning the movie’s volume up so much that they decide to hand out protective headsets, which muffle the sound so much that people turn up the volume of the headsets again, and to protect against that volume, …

6) Coffees filled to the rim
So you ask your barista to “leave room for milk”. You’ll get maybe 5 mm less coffee than the cup’s rim. You ask them to “leave AMPLE room for milk” and they throw in another millimeter. Ask them to “only fill the cup two thirds” and they might leave out just enough coffee so you can carry the cup over to the milk counter without spilling – where you end up emptying the rest of the coffee you never wanted into the trash. Yep. Happens all the time. The smallest size at Starbucks is “tall”, which is 12 (!) ounces. That’s a LOT of coffee…

Can anyone relate? I could probably go on if I thought about it more. But hey. Don’t get me wrong, I still like living here. It’s just that… AAAARGHHH!

Customer Service – Don’t Treat Us Like We’re Still in the ’90s

I witnessed a conversation on my Facebook wall today that obviously caught my attention, as I work in the customer service technology industry. I want to share it here:

It is obvious how the recent advances in communication technology simply overwhelm some bigger organizations. While in the past they could compensate for a lack of adoption speed because technological progress was slower, it is nowadays pretty much impossible to keep up with the WhatsApps, Snapchats, and Secrets of this world. But looking at what my friends discuss here – they really don’t demand much. SMS and email are technologies from the Stone Age! SMS celebrated its 20-year anniversary in December 2012! And still today, so many years later, businesses are not using the power of this channel for the most basic B2C customer communication.

And phone calls that ask you to call back on a number that isn’t even the one the call is coming from? Or a text that asks you to call back? Why the forced channel switch, if I obviously chose to have you reach me via SMS?

It quite frankly blows my mind, knowing what would be possible if only modern software and cloud solutions were embraced. How much longer will consumers tolerate companies with customer service technology from the ’90s? Well, research shows that they are increasingly taking their business elsewhere already, so hopefully the dinosaurs of the service industry will either go extinct soon, or adapt. Remember, it’s not the strongest that survive, but the “fittest” – meaning those that can adapt…

Pebble – How to Not Over-Architect a Smartwatch

I bought my Pebble at retail a few months ago and have since fallen in love with it. It doesn’t try to replace a smartphone, which is a stupid idea to begin with. The screen of any smartwatch is tiny by definition, and I don’t want to update Facebook, play games, or write blog posts on my wrist. I can easily pull out my smartphone for those more complex tasks.

The Pebble does what it advertises it does:

  • Send all notifications from my iPhone’s lock screen to my wrist (the latest release does it with a delay of only 2 seconds)
  • Show CallerID of an incoming call and allow the rejection thereof
  • Show CallerID and content of incoming SMS
  • Show content of Twitter mentions and DMs
  • Show content of other apps like Lync, Facebook, Reminders, etc.

The battery lasts for days, and the accelerometer built into the device turns on the backlight with a flick of the wrist. (Though for only 2 seconds, which isn’t enough and unfortunately not configurable.)

With the help of watchface-generator.de I can create my own watchfaces (i.e., customized screens) within a minute, including date and time information and a custom image – I prefer one of my girlfriend. 🙂

The most useful feature of my Pebble during a workday is the display of my current and next calendar appointments. I achieve that with 2 apps. On my iPhone, I first download Smartwatch+ from the App Store. Within that app, I can then install 2 Pebble watchapps:

  1. SmartStatus, which displays date, time, current temperature, phone battery percentage, next calendar appointment (or current one if I’m currently in one – which I don’t like, as I prefer to see the next one on my calendar)
  2. Smartwatch+ (same name as the iOS app, which is confusing at first), which gives me comprehensive weather info, a full browsable list of my appointments, and my reminders. Furthermore it does stocks, music, camera (huh?), GPS, and custom HTTP requests to control things like your smart home… Nice, and all I’d really ever need (I think… I’ll let Apple convince me otherwise)

I typically find myself using Smartwatch+’s calendar view on my wrist during a workday, as I can see all upcoming appointments at a glance, plus the time. After work, I switch over to seeing my girlfriend smile at me and tell me time & date. 🙂

The Bluetooth connection is still slightly buggy, which can be annoying. To get an update of weather or calendar data, I occasionally have to do the following in the Smartwatch+ iOS app to re-establish the connection:

  • Open Smartwatch+ and hit Disconnect
  • Wait 10 seconds
  • Connect again
  • Wait a few seconds
  • Press the top right button on the Pebble to re-sync

I can live with that; still, I hope they’ll fix it eventually.

I am happy with the purchase and don’t think I’ll ever need more from a smartwatch. But as I said, once Apple releases their iWatch, I’ll be happily corrected.

How New Technologies Necessitate Customer Care Collaboration

Back in September, I wrote about the need to break up organizational silos in order to improve overall customer service. The rise of the smartphone and the advent of social networks simply necessitate collaboration among different teams and departments to jointly work towards the goal of a consistent customer experience.
 
Read more here: http://www.icmi.com/Resources/Technology/2013/09/How-New-Technologies-Necessitate-Customer-Care-Collaboration