Krishanu Te Chattopadhyay

from Sunnyvale, CA
Age ~53

Krishanu Chattopadhyay Phones & Addresses

  • 590 Saco Ter, Sunnyvale, CA 94089 (408) 541-1503
  • 1243 Vicente Dr, Sunnyvale, CA 94086
  • 1065 Greco Ave, Sunnyvale, CA 94087
  • 752 Marlin Ave, Foster City, CA 94404 (650) 357-1320
  • 1121 Foster City Blvd, Foster City, CA 94404 (650) 357-1320
  • 752 Marlin Ave #2, San Mateo, CA 94404 (650) 357-1320
  • 1121 Foster City Blvd #2, San Mateo, CA 94404 (650) 357-1320
  • 1200 Dale Ave, Mountain View, CA 94040
  • Santa Clara, CA
  • Mineola, NY

Work

Position: Financial Professional

Education

Degree: Associate degree or higher

Resumes

Krishanu Chattopadhyay

Location:
590 Saco Ter, Sunnyvale, CA 94089
Industry:
Computer Software
Work:
Netapp
Engineering Manager
Skills:
Software Development
Mobile Devices
Embedded Systems
Android
Cryptography
Software Design
C
Linux
C++
Object Oriented Design
Java
Embedded Software
Hadoop
Python
Big Data
Agile Methodologies
MySQL
Apache
Representational State Transfer (REST)
XML
Eclipse
IntelliJ
VxWorks
CDMA
Languages:
English
Financial Advisor

Work:

Financial Advisor
Sunnyvale, California

Location:
Sunnyvale, CA
Work:

Sunnyvale, California

Publications

Us Patents

Content Navigation And Selection In An Eyes-Free Mode

US Patent:
20140215339, Jul 31, 2014
Filed:
Jan 28, 2013
Appl. No.:
13/751940
Inventors:
barnesandnoble.com llc - , US
Krishanu Chattopadhyay - Sunnyvale CA, US
Saj Shetty - Fremont CA, US
Assignee:
barnesandnoble.com llc - New York NY
International Classification:
G06F 3/0488
G06F 3/0482
G06F 3/16
US Classification:
715727, 715863, 715810
Abstract:
Techniques are disclosed for facilitating the use of an electronic device having a user interface that is sensitive to a user's gestures. An “eyes-free” mode is provided in which the user can control the device without looking at the device display. Once the eyes-free mode is engaged, the user can control the device by performing gestures that are detected by the device, wherein a gesture is interpreted by the device without regard to a specific location where the gesture is made. The eyes-free mode can be used, for example, to look up a dictionary definition of a word in an e-book or to navigate through and select options from a hierarchical menu of settings on a tablet. The eyes-free mode advantageously allows a user to interact with the user interface in situations where the user has little or no ability to establish concentrated visual contact with the device display.
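The abstract describes gestures that are interpreted without regard to where on the display they occur. A minimal illustrative sketch of that location-independence idea is below; this is not the patented implementation, and all function names and the gesture-to-action mapping are hypothetical:

```python
# Sketch: classify a touch gesture by its direction/shape alone, ignoring
# absolute screen coordinates -- the same swipe made anywhere on the display
# maps to the same action, as in the eyes-free mode described above.

def classify_gesture(start, end, threshold=50):
    """Classify a gesture from its start/end points using only relative motion."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return "tap"
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

# Hypothetical mapping from gestures to eyes-free menu actions (illustrative only).
EYES_FREE_ACTIONS = {
    "swipe_right": "next_menu_item",
    "swipe_left": "previous_menu_item",
    "swipe_down": "enter_submenu",
    "swipe_up": "exit_submenu",
    "tap": "select_current_item",
}

def handle_gesture(start, end):
    # Because only (dx, dy) matters, the user need not look at the screen
    # to target a specific control -- any region of the display works.
    return EYES_FREE_ACTIONS[classify_gesture(start, end)]
```

Note how the same rightward swipe performed at two different screen locations resolves to the same action, which is the property the abstract emphasizes.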

Context Based Gesture Delineation For User Interaction In Eyes-Free Mode

US Patent:
20140215340, Jul 31, 2014
Filed:
Jan 28, 2013
Appl. No.:
13/751951
Inventors:
barnesandnoble.com llc - , US
Krishanu Chattopadhyay - Sunnyvale CA, US
Douglas Klein - Los Altos Hills CA, US
Assignee:
barnesandnoble.com llc - New York NY
International Classification:
G06F 3/0488
G06F 3/0482
G06F 3/16
US Classification:
715727, 715863, 715810
Abstract:
Techniques are disclosed for facilitating the use of an electronic device having a user interface that is sensitive to a user's gestures. An “eyes-free” mode is provided in which the user can control the device without looking at the device display. Once the eyes-free mode is engaged, the user can control the device by performing gestures that are detected by the device, wherein a gesture is interpreted by the device without regard to a specific location where the gesture is made. The eyes-free mode can be used, for example, to look up a dictionary definition of a word in an e-book or to navigate through and select options from a hierarchical menu of settings on a tablet. The eyes-free mode advantageously allows a user to interact with the user interface in situations where the user has little or no ability to establish concentrated visual contact with the device display.