1. AnimalPak200's Avatar
    Hey guys, a few days ago I was reading a thread discussing the tidbits found in the leaked 10.3 OS that hint at a much improved "Intelligent Assistant" (IA). As expected, the discussion mostly involved comparing IA to its counterparts, Siri, Cortana and Google Now, and whether or not such 'assistants' are practical or simply gimmicks. This got me thinking... why are these mostly voice-recognition-based assistants dismissed as gimmicks? Do most people really prefer to convey commands with their fingers instead of talking?

    With BlackBerry using the name "Intelligent Assistant", it really makes you think... what would a person expect of an assistant? Would you really expect them to act only in direct response to a spoken request? No way! If I had a real personal assistant, I would expect them to use all their faculties and resources to look at my data (communications, calendars, frequent contacts, etc.), observe and learn my habits, routines and preferences, and combine it all to take the initiative and predict my most likely needs. In other words, my assistant should not be limited by my own "intelligence"/forgetfulness/lack of organization, etc. It should pick up my slack and 'save me' from being the usual human.

    As it stands, all these smartphone assistants rely on the user to explicitly trigger them (press the home button, say 'Google Now', press the microphone symbol, etc.). If you don't trigger them, they just sit there unused, caged inside their shiny glass, semiconductor and metal (or plastic?--I'm looking at you, Samsung) box. Why?!

    They have many other 'faculties' aside from the microphone, such as a clock, gyroscopes, accelerometers, magnetic sensors, ambient light sensors, GPS, etc., which are being ignored by the assistant. The assistant also has access to all our communications and scheduling data. If BlackBerry really wants to make this an 'Intelligent Assistant', differentiated from the maturing voice-recognition-based offerings from Google/Apple, it needs to incorporate all of these to offer a proactive, semi-independent assistant.

    Here are a few examples, starting from very simple to slightly ambitious (but completely possible).

    Auto-Dial:

    Whether we realize it or not, most of us do certain things in very mechanically-predictable and repeatable patterns. We get into the 'habit' of ___. How we handle our smartphones for specific tasks is definitely one of those things. Usually we rest our smartphones either flat on a table, inside a dark pocket, or inside a holster. To make a call, you pick it up or remove it from the pocket/holster, move it to a certain intermediate location to dial, and raise it up to your head for the call.

    Each of those states can be detected using current smartphone sensors. When the device is on a table, it is flat and motionless (detectable using the gyroscope and accelerometer). If it was in a pocket/holster, the ambient light sensor would detect low light and/or the sleeper magnet sensor would be activated. Once you remove it from said starting position, the sensors would register the change, triggering monitoring of the gyroscope and accelerometer as you move the phone towards your head, at which point (a) the motion suddenly stops (detectable), (b) the device is held vertically upright (detectable) and (c) the proximity sensor is triggered (detectable).

    Using an appropriate machine-learning technique (neural networks, Kalman filters, etc.), each user-specific sequence of sensor changes could be associated with the user picking up the phone and putting it to their head, at which point the Intelligent Assistant merely prompts "DIAL" through the phone earpiece (not the speakerphone), the user simply responds "Jessica" or "234-0567", and waits for the call to go through. No button press, no awkward "Siri... call Jess", no embarrassing back and forth with your phone broadcast through the speakerphone. You simply pick up your phone, put it to your head and say the contact name or phone number.
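
    The pickup sequence described above boils down to matching an ordered chain of sensor events. A minimal sketch of that idea (the event names and the `matches_gesture` helper are illustrative, not any real BB10 API; a production version would use a trained classifier rather than a fixed pattern):

    ```python
    # Hypothetical "lift phone to ear" gesture, as an ordered sequence
    # of sensor events: leave rest state -> motion -> sudden stop ->
    # upright orientation -> proximity sensor covered.
    GESTURE = ["leave_rest", "in_motion", "motion_stop", "upright", "proximity_near"]

    def matches_gesture(events, pattern=GESTURE):
        """True if `pattern` occurs as an ordered subsequence of `events`."""
        it = iter(events)
        return all(any(e == step for e in it) for step in pattern)

    # A noisy event stream that still contains the full gesture in order
    stream = ["leave_rest", "in_motion", "in_motion", "motion_stop",
              "upright", "proximity_near"]
    print(matches_gesture(stream))  # True
    ```

    The subsequence check tolerates extra events in between (sensor noise), which is the main reason to match a pattern rather than demand an exact sequence.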


    Pre-emptive traffic/weather/context alert/report based on calendar appointment:

    It's safe to say that a smartphone's calendar is perhaps one of its most consistently used features. Each entry provides the opportunity to extract a wealth of information about something the user will be doing in the future. Imagine if, instead of just reminding you that you have an upcoming event, the IA extracted the calendar entry information, including title/description, time, day, location, etc., polled the internet for weather and traffic data, and used it to alert you to various scenarios. For example, it knows you have an appointment at the dentist in an hour. Using your current location (GPS), it determines the shortest/fastest routes, checks for any adverse traffic conditions, calculates the ETA, and modifies the 'reminder' time accordingly. When it displays the reminder, it alerts you to the traffic conditions and provides the ETA, all without the user having to bring up a mapping application to check the traffic, only to realize that they are already late!
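
    The reminder-shifting part of this is simple arithmetic once the assistant has an ETA. A sketch, assuming some hypothetical traffic service has already reported the drive time (the `reminder_time` helper and its buffer are my own illustration):

    ```python
    from datetime import datetime, timedelta

    def reminder_time(event_start, eta_minutes, buffer_minutes=10):
        """Fire the reminder early enough to arrive on time, with a small buffer."""
        return event_start - timedelta(minutes=eta_minutes + buffer_minutes)

    appt = datetime(2014, 5, 26, 15, 0)   # dentist at 3:00 PM
    # Suppose a (hypothetical) traffic service reports a 45-minute drive today
    print(reminder_time(appt, eta_minutes=45))  # 2014-05-26 14:05:00
    ```

    The point is that the reminder offset becomes a function of live traffic data rather than a fixed "15 minutes before" setting.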

    Similarly, if you have scheduled an activity that an internet search reveals to be 'outdoors' (e.g. company picnic, kids' football game, skiing trip, etc.), the IA will pre-emptively check the weather and alert you to any adverse conditions (too hot, likely to rain, etc.) prior to the event so that you can dress appropriately, remember to take an umbrella, etc.
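
    The "is this event outdoors?" classification could start as crude keyword matching on the entry title before graduating to a real search or learned model. A toy sketch (the word list is invented for illustration):

    ```python
    # Illustrative-only keyword flagging of calendar entries as "outdoors".
    OUTDOOR_WORDS = {"picnic", "football", "skiing", "hike", "bbq", "beach"}

    def looks_outdoors(title):
        """Flag a calendar entry title that suggests an outdoor activity."""
        return any(word in title.lower() for word in OUTDOOR_WORDS)

    print(looks_outdoors("Company picnic at Riverside Park"))  # True
    print(looks_outdoors("Quarterly budget review"))           # False
    ```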

    This could be further extended to any event-driven information retrieval task, such as a "while you're in Rome, you might be interested in these top attractions" type of notification, in advance of a scheduled entry titled "Trip to Rome". Of course, it would first ask you if the trip is for business or pleasure, and tailor accordingly.


    Incomplete Data Augmentation:

    OK, so the previous calendar-based feature sounds good, but I'm sure many users don't always enter a location for their scheduled events. The IA should be able to learn where it is you do the things you do! Take the previous dentist appointment example. I initially added a dentist appointment entry to my calendar, but left out the location because I didn't really have time and I know where it is anyway (very human-ish of me). The time for the appointment comes, I travel to the dentist's office, and remain there for approximately the duration of the appointment. The IA, using its location services, can sense the static location correlating to the appointment period. To double-check, it can then search the internet for dentists around that area, and add the matching address to my 'Dentist' calendar entry and to a global list of 'personalized environment' locations (i.e., places where you do things). This way, the next time you schedule a dentist appointment, the IA will augment the calendar entry with the independently-learned address, and thereafter be able to provide you with smart traffic information as detailed in the previous section. The IA should be able to pick up the slack for the user in order to still be 'intelligent'.
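
    The core of "sensing the static location correlating to the appointment period" is just detecting that GPS fixes taken during the event window cluster tightly around one spot. A rough sketch under that assumption (the `learn_location` helper and its ~100 m threshold are invented for illustration):

    ```python
    def learn_location(samples, min_dwell=3):
        """If the device stayed put during the appointment window, return the
        mean (lat, lon) as the learned place; otherwise None.
        `samples` is a list of (lat, lon) GPS fixes taken during the window."""
        if len(samples) < min_dwell:
            return None
        lat = sum(s[0] for s in samples) / len(samples)
        lon = sum(s[1] for s in samples) / len(samples)
        # ~0.001 degrees is roughly 100 m; crude, but fine for a sketch
        if all(abs(s[0] - lat) < 0.001 and abs(s[1] - lon) < 0.001 for s in samples):
            return (round(lat, 4), round(lon, 4))
        return None

    fixes = [(43.6535, -79.3839), (43.6536, -79.3841), (43.6534, -79.3840)]
    print(learn_location(fixes))  # a single learned (lat, lon) near all three fixes
    ```

    The learned coordinate would then seed the reverse lookup ("search the internet for dentists around that area") to attach a human-readable address to the calendar entry.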


    There are many other examples, but this post is already long as it is. I guess my point is that, going forward, any 'Intelligent Assistant' should not simply wait for a forgetful, imperfect human to request assistance. It should be proactive in leveraging its wide array of sensors and data connectivity to augment, enhance and learn from the data we provide it, learn our routines and habits, predict our needs, and help us in ways we didn't even realize (or forgot) we needed.

    Until then, I guess I'll keep using my fingers.
    05-26-14 01:26 PM
  2. stabstabdie's Avatar
    That would use up a tremendous amount of resources that the phone probably doesn't have. Perhaps that is another generation of technology away.
    For now I'd be happy with no more wifi connection issues and google play services.
    05-26-14 02:17 PM
  3. early2bed's Avatar
    I'm sure Google is working on it. This is a project that requires an ecosystem of content, such as the real-time traffic data that Google collects. It also requires a ton of R&D investment that only the big players can afford to make. I think that BlackBerry is going to be in catch-up mode in this area.

    Also, for every convenience benefit of having your smartphone anticipate what you want, there is usually a hassle factor. Right now, I'm trying to figure out why events from Facebook acquaintances are showing up in my cloud-based calendar. Did I enable that on Facebook or one of my calendar apps? On the PC or the smartphone? Is it ever going to be smart enough to know that I won't be attending an event on the other side of the country today?
    05-26-14 02:57 PM
  4. systemvolker's Avatar
    OP, everything you said can happen. The question is, have you found a developer who can and will do this?
    Do you know what happened to the "SayIt" app?

    Posted via CB10
    05-26-14 03:22 PM
  5. AnimalPak200's Avatar
    Maybe it is a generation away, but BlackBerry pretty much already missed this generation of device technology, which is exactly why they need to look way ahead instead of trying to retrace their competitors' steps.

    Regarding the need for an ecosystem... all you really need is the capability to search the internet and parse through a few keywords. This isn't really limited to Google, although you could technically use Google searches.

    As for being resource intensive... maybe, maybe not so much. Sensor data can be obtained via interrupts (rather than periodic polling). Sure, you may need some background processing, but these devices are overpowered and sit idle for quite a bit of the day.
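
    The interrupt-versus-polling distinction can be sketched as callback registration: the assistant registers handlers and does nothing until the OS delivers an event, instead of reading sensors on a timer. This `SensorHub` class is a toy stand-in for whatever driver layer the platform actually provides:

    ```python
    # Sketch of event-driven (interrupt-style) sensor handling vs. polling.
    class SensorHub:
        def __init__(self):
            self._handlers = {}

        def on(self, event, handler):
            """Register a callback for a named sensor event."""
            self._handlers.setdefault(event, []).append(handler)

        def fire(self, event, value):
            """Called by the (hypothetical) OS driver layer when hardware fires."""
            for handler in self._handlers.get(event, []):
                handler(value)

    hub = SensorHub()
    log = []
    hub.on("proximity_near", lambda v: log.append(("start_listening", v)))
    hub.fire("proximity_near", True)  # no polling loop ran to get here
    print(log)  # [('start_listening', True)]
    ```

    Between events the assistant consumes essentially nothing, which is the crux of the "overpowered and mostly idle" argument.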

    Now the potential to be annoying is a big deal for sure. I'm with you on the horror story that is the BB10 contact list. Sounded good in theory, but a disaster in practice. That could be true for this as well... although the problem with the contact list is that most expected it to function like an old paper pad, while with an intelligent assistant... well, I wouldn't be as angry with it if it didn't quite measure up to a capable secretary.

    Posted via CB10
    05-26-14 04:55 PM
  6. mapsonburt's Avatar
    I'm not going to **** over your very good ideas. You are right about the interrupts. Only the lamest coder uses polling. Most of your ideas are eminently implementable today. It would be great if BlackBerry was working on them. Thanks!

    Posted via CB10
    05-27-14 06:47 AM
  7. Guignards's Avatar
    .. I like you. You've got some good ideas.

    Posted via CB10
    08-15-14 12:15 AM
  8. IFCArea51's Avatar
    +1

    Posted via Q10
    08-15-14 01:31 AM
