
Achieving Enterprise Agility at the Retail Edge

At any conference, you start to feel fatigued near the end of the day. Between the sprawling session locations, the constant engagement, and the sometimes overwhelming number of people and places to navigate, attention begins to wane by the last sessions you attend. Presenters in the unfortunate end-of-day time slots stand between attendees and dinner. I was pleasantly surprised to be reinvigorated by my second-to-last session of the day. Presenters Connie Yu and Joana Cruz, of Target, managed to bring humor, excitement, and energy to their topic, titled: Achieve Enterprise Agility at Edge Locations by adopting Kubernetes, Spinnaker and Unimatrix.

My background is in software-defined infrastructure orchestration at scale, both in a data center environment and in my company’s private cloud deployment. I was curious to see this type of orchestration in use by a retailer to manage their in-store deployments. Connie and Joana structured their session as a humorous conversation on how the stores could adopt supply chain methodologies to improve operations. By doing so, Target went from delivering software to 1,850 stores just 3 times a year to multiple times daily. How? Let’s dive in!

Target’s Strategy:

Have you ever installed an operating system, and all of the software you want to use on it, from scratch? Most consumers buy an off-the-shelf system that has already been configured for their baseline needs. In an enterprise environment, IT is (usually) responsible for installing and configuring the “stack” — the operating system and all necessary software — in addition to modifying settings to secure and optimize the system, and they have to do this for thousands of systems, not just one. If you have ever hand-built even one system, you know what tedious, error-prone work it can be; now imagine doing it thousands of times. The time spent to automate the process is well worth the investment: it saves countless hours and makes deployment repeatable, predictable, and scalable. To do this, you need the right tools, and you also need to take the time to consider the architecture to ensure it will meet the requirements.

There are unique infrastructure constraints in a retail store setting. The primary compute resources are thin clients, which have less I/O, memory, storage, and CPU than you would find in a traditional data center. For this reason, some of the conventional architectures, like client/server segmentation, have to be redesigned to run at the edge. Target presented three primary components of their edge computing solution.

Key components of Target’s edge computing strategy

Microservices are lightweight and loosely coupled, which makes the architecture a good fit for distributed computing with limited resources, as is the case in the edge retail environment. Containers allow application dependencies, packages, and software to be fully encapsulated so that they run in an isolated environment side by side with other containers — a logical choice for consolidating services on your infrastructure without disturbing functionality. Microservices typically communicate over REST, which often runs on HTTP, and securing all of the data flowing between the services on different endpoints is crucial — no store wants to become the next data breach cautionary tale.
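To make the microservices-over-HTTP idea concrete, here is a minimal sketch in Python. The “inventory” service, its endpoint, and the stock data are all invented for illustration, and a real store service would also sit behind TLS, per the security point above:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A toy "inventory" microservice: one small, loosely coupled service
# exposing a REST endpoint over HTTP. The SKUs and quantities are
# invented for this sketch.
class InventoryHandler(BaseHTTPRequestHandler):
    STOCK = {"sku-123": 7}

    def do_GET(self):
        sku = self.path.strip("/")
        body = json.dumps({"sku": sku, "qty": self.STOCK.get(sku, 0)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def query_inventory(sku: str) -> dict:
    """Spin up the service, query it once over HTTP, then shut it down."""
    server = HTTPServer(("127.0.0.1", 0), InventoryHandler)  # port 0 = any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/{sku}"
    with urllib.request.urlopen(url) as resp:
        result = json.load(resp)
    server.shutdown()
    return result

print(query_inventory("sku-123"))  # {'sku': 'sku-123', 'qty': 7}
```

The point of the sketch is the shape of the interaction: small services, a narrow HTTP contract between them, and nothing shared except the API.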


Orchestration, in brief, is the practice of automating the configuration and management of your infrastructure. Whether you have three systems or thousands, managing them is far easier when you have the right tools. Target chose Kubernetes, the open source orchestration platform, as their orchestration agent.

Using Kubernetes for orchestration

The primary reasons cited for this choice were: self-healing, state maintenance, and high availability. Often, running and scheduling workloads is segmented onto two different servers. In this case, Target containerized the applications, co-located them on the same host, and had a cluster of servers providing the functionality, thereby achieving consolidation and redundancy.
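As a concrete illustration of how Kubernetes delivers self-healing and high availability, here is a minimal Deployment manifest built in Python (Kubernetes accepts JSON as well as YAML). The names and image are placeholders, not Target’s actual workloads:

```python
import json

# A minimal Kubernetes Deployment manifest built as a plain dict.
# `replicas: 3` is what buys self-healing and high availability: the
# control plane restarts or reschedules pods to keep 3 copies running.
# The service name and image below are placeholders.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "store-service"},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": "store-service"}},
        "template": {
            "metadata": {"labels": {"app": "store-service"}},
            "spec": {
                "containers": [
                    {"name": "store-service",
                     "image": "registry.example/store-service:1.0"}
                ]
            },
        },
    },
}

# Kubernetes accepts JSON manifests, so this output could be piped
# straight to `kubectl apply -f -`.
print(json.dumps(deployment, indent=2))
```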


The deployment tool of choice for Target was Spinnaker. I had never heard of it before, and if you’re like me, you had to Google it. I’ll save you the trouble; here is the Cliff’s Notes version: it is an open source tool that supports multiple cloud providers and integrates with many existing orchestration tools (like Kubernetes).

Using Spinnaker to deploy

Spinnaker uses the concepts of applications, clusters, and server groups to construct and manage deployment pipelines. In short, it allows you to automate which target devices you want to install your software on.
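To give a feel for what a pipeline definition looks like, here is a sketch loosely modeled on Spinnaker’s JSON pipeline format. The field names and values are illustrative rather than a guaranteed exact schema; the point is the shape: an application owns a pipeline, triggers start it, and stages say where the manifest gets deployed:

```python
import json

# A pipeline definition loosely modeled on Spinnaker's JSON format.
# Field names here are illustrative, not an exact schema.
pipeline = {
    "application": "store-promotions",   # the Spinnaker application
    "name": "deploy-to-pilot-stores",
    "triggers": [
        # e.g. kick off the pipeline when a new image lands in the registry
        {"type": "docker", "repository": "registry.example/promo-service"}
    ],
    "stages": [
        {
            "type": "deployManifest",        # deploy a Kubernetes manifest
            "name": "Deploy to pilot cluster",
            "account": "pilot-stores",       # which target cluster/server group
        }
    ],
}
print(json.dumps(pipeline, indent=2))
```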

Edge Cloud

One of the points Connie and Joana mentioned was that Target had few corporate data centers, but 1,855 stores, all with their own infrastructure. They wanted to manage each store like its own cloud, and so they needed an edge cloud provider.

Target named their edge cloud provider Unimatrix, which is a nod to Star Trek. Unimatrix commands and controls all of the Kubernetes deployments at the edge.

Edge Cloud Provider: Unimatrix

Unimatrix reuses the existing Kubernetes API, and allows deployment by pilot, mass roll out, or even by store regions. Maintaining the API contract and only adding additional microservices to augment custom requirements allowed Target to continue using the existing Spinnaker and Kubernetes integration. An additional component is the Unimatrix agent. The agent takes the role of the operator. It applies the application spec to the Kubernetes clusters. As the spec is applied, it reports back to Unimatrix, which syncs the state to Spinnaker so the developer can see the deployment progress. You can read more in depth about Unimatrix on the Target blog.
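Since Unimatrix itself is Target-internal, here is a purely hypothetical Python sketch of what an agent in its role might do, based only on the description above: fetch the desired application spec from the central server, apply it to the local Kubernetes cluster, and report status back so Spinnaker can show progress. Every function name and the spec format are my own invention:

```python
# Hypothetical sketch of a Unimatrix-style agent reconcile step.
# All names and formats below are invented for illustration.

def fetch_desired_spec(store_id: str) -> dict:
    """Stand-in for a call to the central deployment server."""
    return {"app": "promo-service", "version": "1.4.2", "replicas": 2}

def apply_to_cluster(spec: dict) -> str:
    """Stand-in for applying the spec via the local Kubernetes API."""
    return "applied"

def report_status(store_id: str, status: str) -> dict:
    """Stand-in for syncing state back so the developer sees progress."""
    return {"store": store_id, "status": status}

def reconcile_once(store_id: str) -> dict:
    spec = fetch_desired_spec(store_id)
    status = apply_to_cluster(spec)
    return report_status(store_id, status)

print(reconcile_once("store-1855"))  # {'store': 'store-1855', 'status': 'applied'}
```

A real agent would run this loop continuously, which is what makes the desired state converge across all stores.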

Demoing the Use case: Deploy a promotion

Now that we know all of the components of the system, we can see the process in action.

Process flow: deploying a promotion
  1. All promotion change features are implemented as microservices
  2. Code is pushed to Git
  3. Security tooling is triggered
  4. The build process runs
  5. The built image goes into the Docker registry
  6. The Spinnaker pipeline is triggered, and pushes the application spec to the Unimatrix central server
  7. The Unimatrix deployment server begins to sync the application spec to the agents in the defined pipelines

As all of this is happening, the state of the deployment is maintained and can be visualized in Spinnaker:

Deployment progress in Spinnaker UI — the green is complete, red is still in progress

And that is how it’s done!


So their teams did all of that work. What was the result?

  • Onboarded more than 90 applications in the first 6 months, including IoT platforms, new microservices, and video machine learning capabilities
  • Running in production at Target edge locations

How did they accomplish this success? The answer — defining very clear objectives from day one:

  1. The ability to deploy to stores in a targeted fashion (pun intended)
  2. Enable application developers to do the deployments
  3. Have clear measurements of progress
  4. Reuse existing tools so developers can rapidly adopt the technology
  5. Stay flexible while building the solution — Target accelerated development by using open source, and created custom solutions only when necessary

Not to be biased, but…this was my FAVORITE GHC session. I loved the humorous exchanges between Joana and Connie: they were well rehearsed, knew the material, presented the flow logically, and really took the audience on the journey of WHY developing this technology was necessary, how they implemented the solution, and the return on the investment. Also, might I add, the speakers had the best red tennis shoes, very on brand.

Better code with inner source

I was a member of the Software track committee for GHC, which meant that by attending the conference in person, I was also responsible for chairing a session. Session chairs arrive in the room prior to the start of the presentation and ensure that the presenters have arrived, there are no technical difficulties with A/V equipment, the timer is set up and ready to go, and the session starts and ends on time. The session that I chaired was titled “Better Together: The Inner Source Journey.” Presenters Aliza Carpio and Rocio Montes shared the findings of Intuit’s case study detailing its lessons from embarking on an inner source journey.

By now, you’re probably asking yourself, “What is inner source?” In brief, it is leveraging open source software development practices within your organization. Having been on teams that develop with and without these practices, I can personally attest to the value of institutionalizing them. Not everyone will want to adopt new ways of working; some will have to be dragged kicking and screaming. Every journey starts with a champion (or champions), and Aliza and Rocio are those brave individuals in this story.

If you work for a company with a large number of developers aligned with different business units, who have competing priorities, and are dispersed geographically, you are painfully familiar with the silos this can create. The challenges Aliza outlined for Intuit were:

  • Code was owned (not stewarded) by teams and individuals, which led to bottlenecks and long wait times for code changes
  • The waiting impacted work/life balance and created a culture of heroics to meet deadlines
  • Repositories lacked documentation, or the documentation was only good enough to be used by the people who already owned the code
  • Varying standards made it difficult to onboard new members

So how do you ensure people can work together across the globe and still produce quality software? Rocio discussed three simple criteria to start. Make sure that repositories are:

  • Discover-able – Can people find it?
  • Contribute-able – Once they have found it, can they incorporate their changes?
  • Compose-able – Are there processes in place to ensure that new changes are non-breaking?

This means that people, process, and technology changes must be instituted.


  1. Make sure there is ONE set of unified guidelines, including
    1. A standard structure for repositories
    2. A minimum set of documents to detail contribution guidelines
    3. Rules of engagement — how do you engage the stewards of the repository?
    4. Templates for issues and pull requests that auto-populate with the expected information
  2. Use containers for local development — it minimizes the time to onboard new developers by fully encapsulating the working environment (packages, software versions, etc.)
  2. Continuous Integration/Continuous Deployment (CICD) must be a part of the process
    1. Pull Request (PR) Builds give contributors and reviewers the opportunity to review tests and coverage to determine quality
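The “minimum set of documents” idea from the guidelines above can even be checked mechanically. Here is a small illustrative sketch; the required file list is my own choice, not Intuit’s:

```python
import tempfile
from pathlib import Path

# Check a repository directory for the minimum set of documents an
# inner source guideline might require. The REQUIRED list below is an
# illustrative choice, not any company's actual standard.
REQUIRED = ["README.md", "CONTRIBUTING.md", ".github/PULL_REQUEST_TEMPLATE.md"]

def missing_docs(repo: Path) -> list[str]:
    """Return the required documents a repository is missing."""
    return [name for name in REQUIRED if not (repo / name).exists()]

# Demo against a throwaway repo containing only a README.
with tempfile.TemporaryDirectory() as d:
    repo = Path(d)
    (repo / "README.md").write_text("# my-service\n")
    print(missing_docs(repo))
    # ['CONTRIBUTING.md', '.github/PULL_REQUEST_TEMPLATE.md']
```

A check like this could run in the CICD pipeline so that repositories stay discoverable and contributable as they evolve.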


  1. See every PR as a mentorship opportunity!
  2. If it does not already exist, create a “trusted committer” role, held by the group of developers who have the knowledge to do code reviews and who will be responsible for merges
  3. Hold code review training to ensure that consistency and quality is maintained for each PR
  4. Make the trusted committer role a rotating on-call so the burden is dispersed and not only one person is responsible — those who are frequent committers will eventually become trusted committers.
  5. Have service level agreements (SLAs) for PR reviews, merge and release; declare and define the timelines so external contributors know what to expect and can plan accordingly
  6. Scale efforts by identifying passionate advocates and leaders at each site (by location) to ensure there is someone physically present to champion the transition, and to assist with the technical implementation
  7. Identify repositories and teams that provide foundational capabilities to the organization, those that are a dependency for many other teams, and bring them to inner source first.
  8. Establish rewards and recognition — work with leaders to make the transition a part of each engineer’s goals


While this was not explicitly mentioned during the session, I want to take the liberty of calling out the technology behind the inner source transition. These are observations based on the internal tools and processes in place at my company to support developers so that tooling does not become an encumbrance.

  • Secure the necessary support and funding to internally or externally host the tools to enable the inner source journey
  • Work with IT to identify the CICD pipeline, revision control, and ticketing systems you intend to use; make certain they are interoperable, and integrate them as an enterprise offering.
  • If Agile is part of your Software Development Lifecycle, add additional integrated tools to support the work methodology
  • Allow adoption of the tools to be independent — some teams may be more mature, so they can adopt the entire suite of tools, whereas others may need to start with just revision control.
  • Maintain training collateral for the suite of tools, but have a means of requesting assistance with onboarding
    • Regular office hours where people can dial in for live assistance
    • A means of submitting tickets or requesting help

Tools that I have used in this space are:

*Tool used by Intuit

Back to the session.

What steps do you have to take to start this journey at your company?

  1. Get support from leaders (Vice President, Director, Architect, etc.) and identify partnerships
    1. If possible, make these connections in person
  2. Research and codify the standards; the unified guidelines, documentation, and rules should be hosted in a repository as well so that they can be treated as code
  3. Identify teams who will be models for the transition to inner source and work with them to bring other teams along
  4. Conduct workshops to teach teams how to live in the culture

Preparing to be a champion

Moving to inner source is not just a technology adoption; it is a change management journey. To champion the transition, you will need to be able to articulate the return on investment and the business value. Aliza mentioned multiple resources to help prepare for making the case.

You must be a storyteller. Take the time to craft your message so you can be effective at being a catalyst for change!

Helping drive success

When people are new to inner source, it can be overwhelming. Intuit created a “good first issues” site that highlights filed issues from various repositories that are great introductory problems to solve for a newbie. Projects have their issues highlighted on the site by labeling their issues with the appropriate tags, and have well documented contribution guidelines. This also gives engineers an opportunity to gain new skills by working on new projects.

Creating a visible badge for taking the training, becoming a contributor, and championing inner source helps foster pride in the transition.

Creating pride in participation

The final result.

Intuit is one year into its journey and is already seeing the benefits. It is not enough to have anecdotal stories; you want to quantify the value added by the transition to inner source through measurement, analysis, and presentation of the results.

Measure and analyze data

Now you’re ready to embark on your journey! Safe travels.

IoT For Social Good

How many times have you encountered technology in the real world and wondered what the purpose was? It seems to have been deployed just because it could be. I know I have. The Internet of Things (IoT) can have large societal benefits when applied purposefully. The GHC session presented by PwC strategy consultant Karla Mendez, entitled Creating Shared Value with IoT Solutions, shone a light on companies that incorporate the idea of Shared Value into their technology solutions.

The idea of Shared Value is characterized by Harvard Business School professors Michael E. Porter and Mark R. Kramer as “a new way for companies to achieve economic success” by finding business value in social problems [1].

Business value in social problems.

Karla, who took the Shared Value course at Harvard, presented several examples of businesses using IoT to serve a societal need. When businesses position their products as providing Shared Value, a cynic can perceive this as corporate branding that hinges on pandering. Companies with Shared Value truly at their core differentiate themselves by (among other things):

  • Placing social good on par with profit as a core value
  • Making those values publicly accessible and transparent
  • Weaving those values into the thread of all aspects of operations, for example by sourcing materials only from suppliers that align with that (social good) value

Farm in a Box

I was late to the session, so I didn’t get to hear from the presenter directly about this company. I did find them online and am glad that I did! They market themselves as an “off-grid toolkit for sustainable agriculture.” You can read more about them here. They have their sustainable development goals right on the front of their website!

Farm in a box sustainable development goals

Intelligent White Boards

An IoT-enabled whiteboard can be deployed in classrooms: as a teacher writes on the board, the content is synced to a corresponding student display, and students can review the lessons outside of the classroom. Another benefit is that the teacher gets real-time feedback — a quick pop quiz to gauge understanding can help the teacher adjust when a given topic is not well understood, and helps students stay on track. This improves the quality of education. EdTech companies are on the rise, and use the data collected on improved student learning to demonstrate their value. The specific solution referenced in this example was developed by SMART.

Smart Lighting

The value proposition for smart lighting is that reducing energy consumption reduces our carbon footprint. Lighting fixtures with IoT sensors can turn off automatically and report when there is an issue (such as a fixture drawing excess energy). This also reduces energy cost. In the hospitality industry, installing IoT-enabled lighting fixtures in hotel rooms can save an estimated $300 per year, per room; the same could be done with heating and cooling. One company providing IoT-enabled smart lighting is the presenter’s company: PwC Connected Solutions.
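The hotel claim is easy to sanity-check as arithmetic; the room count below is an illustrative assumption, while the $300 figure is the one cited in the talk:

```python
# Estimated hotel savings scale linearly with room count.
# $300/room/year is the figure cited in the session; the room
# count is an illustrative assumption for a mid-sized hotel.
savings_per_room_per_year = 300
rooms = 150

annual_savings = savings_per_room_per_year * rooms
print(annual_savings)  # 45000 dollars per year
```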

Homeless Tracking

There are more than half a million homeless Americans, and only 65% of them are in shelters, leaving roughly 200,000 people on the streets without resources. Where can IoT help? If we could assess where homeless people are in order to target locations for shelters and other resources, the government could better plan for and reduce homelessness. One solution in this space is digital kiosks, which could help homeless individuals search for and find resources, like housing. In the Phoenix area, I have seen donation meters. There was not a specific company identified as providing a solution in this space, but with a quick internet search I found that cities like New York and San Francisco are implementing innovative technology to track and align services for their homeless population. I also found an interesting article about LinkNYC, a project that converted old payphones to internet kiosks, though the results were not as rosy as originally intended.

I work for a corporation, and there is an entire branch of the company devoted to social responsibility. It is my hope that as a society, when we learn how our actions impact the world around us, we evolve the way we do business to eliminate negative impacts, not purely for the profit motive, but, because it is the right thing to do.

Recommended Reading:

A few articles I came across that show technology for societal good at work, and one illustrating the unintended consequences.

Smart Contracts using blockchain

One of the emerging trends in computing that everyone talks about, but can seldom articulate, is blockchain. What is that, you ask? If you’re like me, and haven’t done a deep dive, you’ve likely heard of it because of bitcoin, a cryptocurrency (digital money). I found a graphic that provides a nice high level description:

High level blockchain explanation

Where do people use blockchain? Anywhere you need to establish a trust relationship between applications to complete a transaction.
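To make that trust property concrete, here is a toy hash-chained ledger in Python. It captures only the core idea (each block commits to its predecessor’s hash, so tampering with any block invalidates everything after it); real chains add consensus, signatures, and proof-of-work on top:

```python
import hashlib
import json

# A toy hash-chained ledger: each block stores the hash of the previous
# block, so modifying any historical block breaks the chain after it.
def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, "alice pays bob 5")
add_block(chain, "bob pays carol 2")
print(is_valid(chain))                    # True
chain[0]["data"] = "alice pays bob 500"   # tamper with history
print(is_valid(chain))                    # False
```

This is why two parties who do not trust each other can still trust the ledger: rewriting history is detectable by anyone holding a copy of the chain.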

Now that we have a (fragile) foundation, we can dive into the GHC workshop I attended on Ethereum: A blockchain protocol with a complete built-in programming language. The workshop was presented by Julia Chou and Jenny Lu of Coinbase. Over the course of the workshop, we created a simple smart contract as a decentralized app (Dapp).

Prior to walking through the Dapp creation, Julia provided some context for the emergence of smart contracts and the use cases for blockchain.

Web 3.0: a peer-to-peer (decentralized) web, without servers or authorities to manage the flow of information; instead of the traditional client/server model, every client is also a server. This architecture avoids a single point of failure. In Web 3.0, browsers and wallets are able to represent identity and assets, allowing authentication and payment without using a bank or identity service. While this cuts out intermediaries for the transfer of property, it leaves a gap in the establishment of trust (we trust lawyers and banks; we don’t trust unknown users with whom we’ve never interacted) — which is where the technologies we’ve discussed come into play: all processing is done via smart contracts, using blockchain as the data storage and protocol.

Web 2.0 vs Web 3.0

Solidity was the language used in the workshop for developing the Ethereum smart contract. It is influenced by JavaScript and C++, is statically typed, compiles to Ethereum Virtual Machine bytecode, and is object-oriented. Ethereum is Turing complete, so a program could run infinitely — at great expense. To avoid this, there is a concept called gas, paid for with ether (the Ethereum cryptocurrency): you can set a maximum amount of gas to expend, and if you run out, the program stops executing and rolls back the transaction. That is a very condensed description of a complex concept, so if you want to read more about gas, click here.
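The gas mechanics reduce to simple arithmetic. The unit conversions below are the standard ones (1 ether = 10^18 wei; gas prices are usually quoted in gwei, 10^9 wei), while the gas limit and price are illustrative numbers, not live values:

```python
# Back-of-the-envelope gas math. Unit conversions are standard;
# the gas limit and gas price are illustrative, not live values.
WEI_PER_GWEI = 10**9
WEI_PER_ETHER = 10**18

gas_limit = 100_000       # maximum gas you're willing to spend
gas_price_gwei = 20       # price per unit of gas, in gwei

max_fee_wei = gas_limit * gas_price_gwei * WEI_PER_GWEI
max_fee_ether = max_fee_wei / WEI_PER_ETHER
print(max_fee_ether)  # 0.002 ether: the most this transaction can cost.
                      # If gas runs out first, execution stops and the
                      # transaction rolls back.
```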

Now we can get into the workshop…

The workshop itself was intended to be completed over the course of an hour, and was targeted at (I believe) intermediate developers. The instructions are available at https://github.com/jp3hou/dapp-demo if you want to try it for yourself. The presenter slides are also available in the linked GitHub repository.

I will say that if you didn’t have Linux CLI (Ubuntu specifically), GitHub, and programming experience, you would have struggled to complete the exercise; there were a lot of assumptions. Before you start, you will need an Ubuntu OS (or Windows 10 + WSL with Ubuntu), Chrome or another Metamask-supported browser, and your IDE of choice; alternatively, you can use the online Solidity IDE.

The application we deployed was a simplification of the popular CryptoKitties. The core components:

And the underlying technology we used to create the Dapp:

I won’t take the time to walk through each step from the workshop, but would encourage you to follow the instructions from the GitHub Demo. Once you’ve completed the various steps to generate the test accounts, set up Metamask, and launch the Dapp, you can click on a kat to purchase it!

Successful kat purchase!

Security Concerns:

Hopefully you know a little more now than you did before. I certainly came away with my curiosity piqued!


Living on the Edge

With the growing number of intelligent devices (and by “intelligent” I mean: an internet-connected device running some form of software that collects and performs actions on data), bringing intelligence to the Edge has become a more pervasive topic of discussion. What is “the Edge?” Computing devices physically located at the point of use, running software locally instead of in the cloud. The cloud? Remotely hosted computing services such as storage (think: Google Drive) and infrastructure (virtual machines); these services depend on a network connection. This sets the stage for the session: Developing Embedded Intelligence: Opportunities on the Edge. It was led by a panel of experts in the field:

Brenda Zhuong of MathWorks (moderator), Miriam Leeser of Northeastern University, Michaela Blott of Xilinx Research, Mary Ann Maher of SoftMEMS LLC, and Yan Wan from the University of Texas at Arlington.

Panelists for the Embedded Intelligence session

The high level considerations that were illuminated by the panel of experts were:

  • Improving intelligence through data analytics
  • Gaining efficiency by taking advantage of the hardware capabilities
  • Reducing dependency on the (internet) network by hosting data and services where they will be used.

We then got into some real-world use cases that demonstrate the power of harnessing intelligence at the Edge.

Gain Efficiency in Embedded Intelligence; Michaela Blott

Reduce cost through custom arithmetic

Urban Aerial Vehicle (UAV)-based airborne computing for future Internet of Things; Yan Wan

UAV-based Emergency Communication

Developing an Artificial Kidney; Mary Ann Maher

Artificial Kidney


  • Seed neural networks with data coming from the actual environment so you can determine when sensors are malfunctioning — not all data is good data.
  • Understand power constraints — battery life has been a limitation
  • With all of the data being collected, privacy becomes paramount — always encrypt data at rest and in transit

Some best practices…

If you plan to develop in this space, or are already working on a project, there are methodologies you can put into practice that will improve your solution. Here are a few:

  • There is so much sensor data that it is not possible, nor advisable, to send all of it back to the cloud; the capability to do some level of analysis at the edge and send only the relevant data to the cloud is essential
  • Take advantage of cloud computing to simulate sensor data for rapid design, then deploy at the edge
  • Tag quality data so you can establish confidence in the sensor data, with corresponding visualization (graphical representation)
  • IoT devices tend to have limited computing power, so leverage distributed computing models to perform calculations
  • Implement redundancy of sensors and computational devices in mission critical systems
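The first practice, filtering at the edge, can be sketched in a few lines; the threshold rule here is deliberately simple and illustrative:

```python
# Edge-side filtering: analyze readings locally and forward only the
# anomalies to the cloud instead of the full sensor stream. The
# thresholds below are an illustrative rule, not a real deployment's.
def filter_for_cloud(readings: list[float], low: float, high: float) -> list[float]:
    """Keep only out-of-range readings worth sending upstream."""
    return [r for r in readings if r < low or r > high]

stream = [21.0, 21.3, 21.1, 35.7, 21.2, 4.9]   # e.g. temperature samples
print(filter_for_cloud(stream, low=10.0, high=30.0))  # [35.7, 4.9]
```

Even this trivial rule shrinks the upstream traffic from six samples to two, which is the whole point of doing analysis where the data is produced.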

Where do we go from here?

Computer vision and artificial intelligence are changing the landscape of computing. We have to improve the accuracy of vision sensors so that the data is reliable. The cost of quality vision sensors is prohibitive at the moment, and before development can really take off, it must come down. Research into correctness proofs is needed to verify algorithms developed for mission critical Edge workloads. Safety, especially for use cases like autonomous driving, is not a 90% correctness solution — it’s okay if your cell phone drops its connection; it is deadly if your car stops in the middle of an intersection. Portable IoT devices like ultrasound and water testing can bring life-saving technology to network constrained environments, and have the potential to revolutionize medicine.

The panelists were asked to make predictions for the next three years in Edge Intelligence (if any of these things come to pass, you heard it here first!). Some of them:

  • Speech recognition as a human computer interface will become much more prevalent
  • Facial recognition will be integrated into more technology
  • Deployments of sensors and devices in rural areas will help to eradicate diseases
  • More innovative hardware platforms for IoT use cases

Recommended reading:

Hey, Google!

The second session I attended during GHC was a workshop entitled Getting Started with Actions on Google. Presented by Surbhi Chaudhry, Mandy Chan, and Aylin Altiok, this workshop focused on the fundamental concepts of developing a conversational action for Google Assistant.

I own a Google Home Hub and find that this technology is particularly compelling because it is more than a convenience, it democratizes access to technology for those with physical and visual impairments.

During the workshop, Google had engineers with “I’m Here to Help” t-shirts, and it so happened that the engineer at my table did not have arms. It was not a point of conversation, but I noticed. I thought to myself, she must be very passionate about this technology — I wish I had asked about her journey, but I felt it might be inappropriate, so I didn’t.

A couple of key points that the presenters made about the different ways a user interacts with the assistant as opposed to a typical search are:

  1. Assistant queries are 200x more conversational than search
  2. Assistant queries are 40x more likely to be an action

Anecdotally, I know this to be the case because I like to ask my assistant to tell me jokes, sing me a song, and other queries that require it to exhibit more ‘human’ behavior than I would expect of my search engine. It’s gratifying to know that I am not the only one — there is data to back it up!

So let’s dig into the meat of the workshop. Over the course of the hour, we created a Google Action flow that allowed us to respond to a color query.

Google Actions workflow

You can walk through the exercise using just your browser and laptop, all of the resources are online. You’ll just need your Google account, a computer, and an internet connection.

It was great to have a working sample by the close of the workshop. The learning curve for a simple action query is not very high, and it is gratifying to have something working in a short period of time. If you want to play, it’s a good place to start. That being said, if you are a serious developer interested in implementing a more complex use case, the presenters strongly encouraged starting with understanding how best to design your actions. So, “Hey Developers, happy coding!”

(IoT) Mirror, Mirror on the wall

My Grace Hopper conference schedule this year was largely driven by my interest in learning as much as I can about all things Internet of Things (IoT), retail, and edge computing. Why? I’m glad you asked! I recently joined the Internet of Things Group (IOTG) as a Software Architect in Retail, Banking, Hospitality, and Education (RBHE) — Intel loves acronyms. I wanted to take the opportunity to learn what emerging technologies can be applied in retail settings, and to understand some of the challenges that are present. My background is data center and cloud infrastructure automation, so the retail space, with its heterogeneous set of devices and varying network architectures, requires a different, or at least modified, set of tools.

My first session was titled: Mirror, Mirror on the Dressing Room Wall, What Looks Good with This Dress? Presented by Allison Youngdahl and Sunil Shettigar, of Accenture, this presentation explored how brick-and-mortar stores can improve customer experience and give better data to retailers by leveraging IoT.

We have grown accustomed to a very individualized experience online, but in real life (IRL), walking into a retail location is one-size-fits-all. E-commerce sites track our behavior and purchasing history, make recommendations, and often offer free shipping and returns to seal the deal…but once we receive the item in hand, we’re likely to return it nearly 80% of the time. Why? Just Google “online purchase fail,” and you’ll find pages of hilarious results:

Try before you buy. Even after carefully measuring, online purchase fails are rampant

We want to touch and feel an item; in fact, shoppers who try an item in a store are seven times more likely to purchase it, so the dressing room plays a vital role in sealing the deal. Fashion designer Rebecca Minkoff leveraged technology (interactive mirrors paired with RFID tags, plus a video wall) in her stores to drive triple-digit sales growth. So what is the architecture behind such a successful solution? Allison and Sunil presented the solution Accenture Labs partnered with the Council of Fashion Designers of America (CFDA) to build.

The Retail Interactions Platform allows retailers to:

  • Understand the customer journey as they move through the store
  • Track total number of visitors in the store
  • Track item interaction from a clothing rack
  • Determine when an item is taken to a dressing room
  • Understand how item interaction leads to a purchase at the point of sale
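The capabilities above amount to an event funnel: an item is picked up from a rack, carried to a dressing room, and (sometimes) purchased at the point of sale. As a rough illustration, here is a minimal sketch of how such a funnel might be counted; the event names and tuple schema are my own assumptions, not Accenture's actual data model.

```python
# Hypothetical sketch of the item-interaction funnel described above.
# Stage names and the (item_id, stage) event shape are assumptions.
from collections import Counter

FUNNEL = ["rack_pickup", "dressing_room", "point_of_sale"]

def funnel_counts(events):
    """Count how many tagged items reached each stage of the journey.

    `events` is an iterable of (item_id, stage) tuples; an item is
    credited with a stage if any event for it names that stage.
    """
    seen = {}  # item_id -> set of stages observed for that item
    for item_id, stage in events:
        seen.setdefault(item_id, set()).add(stage)
    counts = Counter()
    for stages in seen.values():
        for stage in FUNNEL:
            if stage in stages:
                counts[stage] += 1
    return [counts[s] for s in FUNNEL]

events = [
    ("sku-1", "rack_pickup"), ("sku-1", "dressing_room"), ("sku-1", "point_of_sale"),
    ("sku-2", "rack_pickup"), ("sku-2", "dressing_room"),
    ("sku-3", "rack_pickup"),
]
print(funnel_counts(events))  # [3, 2, 1]
```

A real deployment would stream these events from the RFID readers and sensors into a cloud pipeline rather than a list in memory, but the funnel logic is the same.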

Underlying technology:

  • Estimote Sticker Beacons (Bluetooth proximity sensors for physical objects)
  • Business analytics – calculates the product engagement score
  • Video analytics – heat maps of traffic during peak hours
  • RFID tags
  • MAC sniffers – a Raspberry Pi with a WiFi adapter
  • Data visualization dashboard hosted on Google Cloud Platform
  • RESTful APIs
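The "product engagement score" in the list above is derived from beacon motion data. To make that concrete, here is a toy sketch that scores an item by how long it was in motion, given a stream of timestamped XYZ samples; the scoring formula and sample format are my own invention, not the platform's actual analytics.

```python
# Illustrative engagement score from hypothetical beacon samples.
# Each sample is (timestamp_seconds, x, y, z); the formula is assumed.
import math

def engagement_score(samples, still_threshold=0.05):
    """Score = total seconds the item moved between consecutive samples.

    A displacement larger than `still_threshold` (meters) between two
    samples counts the whole interval as "handling" time.
    """
    moving_time = 0.0
    for (t0, *p0), (t1, *p1) in zip(samples, samples[1:]):
        if math.dist(p0, p1) > still_threshold:
            moving_time += t1 - t0
    return moving_time

samples = [
    (0, 0.0, 0.0, 0.0),  # item at rest on the rack
    (1, 0.5, 0.0, 0.0),  # picked up and moved
    (2, 0.5, 0.0, 0.0),  # held still
    (3, 1.0, 0.2, 0.0),  # moved again
]
print(engagement_score(samples))  # 2.0
```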

Product tracking was accomplished using RFID tags to “see” how the product moves through the store, and data streams from the various sensors were aggregated in Google Cloud. To track the number and frequency of visitors, the MAC sniffer captures the probe requests that cell phones broadcast; each contains a unique MAC address, making it possible to quantify how long an individual was in the store and how often they visit. An engagement score was calculated from the XYZ coordinates reported by the Estimote beacon (was the item picked up and handled? For how long?), and all of this is surfaced in a dashboard.
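To illustrate the visitor-counting idea, here is a minimal sketch that turns raw (MAC address, timestamp) sightings into per-device visit counts and dwell time. The 30-minute gap used to split two visits is an assumed parameter, not a figure from the talk.

```python
# Hedged sketch: aggregating probe-request sightings into visits.
# Input format and the visit-splitting gap are my own assumptions.
from collections import defaultdict

GAP_S = 30 * 60  # sightings more than 30 minutes apart start a new visit

def summarize_visits(sightings):
    """Return {mac: (visit_count, total_dwell_seconds)}."""
    by_mac = defaultdict(list)
    for mac, ts in sightings:
        by_mac[mac].append(ts)
    summary = {}
    for mac, times in by_mac.items():
        times.sort()
        visits, dwell = 1, 0
        start = prev = times[0]
        for ts in times[1:]:
            if ts - prev > GAP_S:       # long silence: close the visit
                dwell += prev - start
                visits += 1
                start = ts
            prev = ts
        dwell += prev - start           # close the final visit
        summary[mac] = (visits, dwell)
    return summary

sightings = [("aa:bb", 0), ("aa:bb", 600), ("aa:bb", 4000)]
print(summarize_visits(sightings))  # {'aa:bb': (2, 600)}
```

In practice, modern phones randomize their MAC addresses between probe bursts, so a production system would need to account for that; this sketch assumes stable identifiers.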

Retail Interactions Platform Architecture
Visitor counting with data visualization

So how does this relate to mirrors?

Dressing room ambiance (temperature, lighting, music) can change how you feel about an item of clothing. Hate harsh, unflattering lighting that accentuates every flaw? So do I! So how can you make dressing rooms more hospitable? Using the platform, retailers know the product’s location; with additional IoT devices in the dressing room, they can adjust the environment and gather immediate customer feedback on a product (new size needed? A different color?) to better serve you:
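One way to picture the responsive dressing room: when a tagged item is detected in the room, look up an ambiance profile for it and push those settings to the room's IoT devices. The profiles and product categories below are hypothetical, just to show the shape of such a rule.

```python
# Minimal sketch of product-driven ambiance rules; all categories and
# profile values are invented for illustration.
AMBIANCE_PROFILES = {
    "evening_wear": {"lighting": "warm_dim", "music": "lounge", "temp_c": 21},
    "activewear":   {"lighting": "bright",   "music": "upbeat", "temp_c": 19},
}
DEFAULT_AMBIANCE = {"lighting": "neutral", "music": "ambient", "temp_c": 21}

def ambiance_for(item_category):
    """Pick the environment settings to send to the room's devices."""
    return AMBIANCE_PROFILES.get(item_category, DEFAULT_AMBIANCE)

print(ambiance_for("evening_wear")["lighting"])  # warm_dim
```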

Mirror, Mirror in the dressing room, give me better lighting

The final takeaway was that retailers should focus on customer education, entertainment, and community. As convenient as an e-commerce site is, it can never compete with a great experience IRL.

GHC 2019 Opening Keynotes

There is a comprehensive rundown of the opening keynotes for Grace Hopper 2019 available via the #GHC19 Daily Download. What follows are my personal reflections on the opening keynotes of this, my third Grace Hopper Conference.

I first attended GHC in 2009 in Tucson, AZ at a conference hall that would not accommodate a single large session today. My coworker and I volunteered as Hoppers helping to pack swag bags for the attendees, which we were able to complete with around 20 other women in the course of a couple of hours.

Grace Hopper Conference 2009, Tucson, AZ

The difference between then and now is night and day. Long lines whose wait times rival those at Disney, sprawling session locations that will help you exceed your step goal for the day three times over (my fitness tracker captured 22,135 steps), and 30-minute shuttle rides to and from the conference center. What has not changed is the awe, inspiration, and hope I feel walking into a room filled with women in technical roles. As I have progressed in my career, rooms like that have become fewer and farther between.

The message communicated by every speaker this morning was that we must be advocates for one another; it is our responsibility to:

  1. Increase the visibility of other women
  2. Commit to help by getting involved
  3. Bring others with us

When all voices are not part of the conversation, the very technology we depend on to make the world a more open and inclusive place can become rife with bias, through no fault or intention of the creators; a simple blind spot is enough. Though inadvertent, the end result is the same: solutions normalized on a standard that does not accurately represent the population.

The unique perspective that Ana Roca Castro brought to her career in technology allowed her to contextualize her solutions for Genius Plaza through the lens of culturally relevant content. She inherently understood that we cannot aspire to be what we do not see. The history of colonization had erased many of the contributions of colonized peoples. By reincorporating their language and history, Ana helped young children and their parents connect to a field they thought did not include them; now, they could see themselves represented.

Dr. Natalya Bailey was told that in order to be more successful, she needed voice lessons to lower her speaking register to be taken seriously, needed to change her leadership model to be more aggressive, and should step aside as CEO. Fortunately, she opted not to follow that sage advice. She instead went on to raise over $25 million in capital for Accion Systems, high-pitched voice and all, and she is still CEO today. Natalya encouraged us to trust who we are, celebrate our differences, and own our strengths. Conforming to established norms for fear of being seen as different would only perpetuate the blind spots that plague us today; we are there to speak for those not in the room. If not us, who?

Neural network innovator Dr. Fei-Fei Li was discouraged from pursuing her desire to create an image database that would serve as the foundation for machine learning in computer vision; she was told it could be career ending. She had never been made to feel so small. Three years later, with the help of her graduate students, ImageNet, a database of 14 million images (and growing), was born. Dr. Li told us that it is okay to feel small; in fact, feeling small compared to the vast universe is what led her to marvel at how the mind works and piqued her interest in teaching machines to learn from images. What she also wanted us to know is that though we may feel small at times, together we are big enough to accomplish anything.

Aicha Evans spoke mainly about her path to CEO at Zoox, but it was her opening quote that impacted me most:

“Nothing in life is to be feared, it is only to be understood. Now is the time to understand more, so that we may fear less.” – Marie Curie

Simply put: we fear what we do not understand. This made me think about the fear that comes with change. When we speak about making room at the table, many feel afraid. There is a fear that bringing more voices to the conversation will drown out those already present; that inclusion for some means exclusion for others. That couldn’t be further from the truth; exclusion excludes. Which brings me to the final speaker of the morning: Jhillika Kumar.

Jhillika’s unique perspective on inclusion came from first-hand experience: watching her autistic brother struggle to participate fully in life without the primary means of communication most of society uses; he had never spoken. Watching her brother interact with an iPad, Jhillika saw him come alive in ways he never had, and she wondered: what would happen if we brought the unique perspectives of the roughly 14% of our society who are differently abled to bear on problems in technology? How many Temple Grandins might we be missing because we haven’t changed our thinking?

That point was driven home for me many years ago at a leadership program, when a man who uses a wheelchair spoke about housing standards. Counter heights, outlets, door widths, and many other items in a home that must be retrofitted at great expense for someone in his situation would have been easy to include in the standard for everyone, at no cost to the majority, if his community had only been part of the initial standards decision-making process. It was eye opening. As clichéd as it may sound, we are blind to what we cannot see.

I have heard it said that “I don’t see differences, I just see people,” but that is not accurate. As the speakers illustrated through their stories, we all process the many unique characteristics we see in people every day; in fact, we want to teach machines to do it. To the women (and men, and non-binary) attendees of GHC, I want to say: I see your differences, and they are beautiful, they are valuable, they are important, so let’s bring them everywhere we go, and maybe together we can change the world a little.

Grace Hopper 2019, Orlando, FL