National Association of College Stores
Campus Marketplace e-Newsletter > People-Counters Fine-Tune Stores’ Conversion Rate
June 2016 — Michael Von Glahn
Conversion rate is among the most basic numbers for any retail operation, indicating the proportion of all shoppers who actually bought something. For example, if you have 300 customers in the course of a day, but only 75 make a purchase, your conversion rate is 25%.
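The arithmetic above can be sketched in a few lines of code. This is just an illustration of the calculation described in the article, using its 300-visitor, 75-purchase example; the function name is my own.

```python
def conversion_rate(buyers: int, visitors: int) -> float:
    """Share of visitors who made a purchase, as a percentage."""
    if visitors == 0:
        return 0.0
    return 100.0 * buyers / visitors

# The article's example: 300 customers in a day, 75 of whom buy.
print(f"{conversion_rate(75, 300):.0f}%")  # 25%
```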
But campus stores that rely only on transaction data from their point-of-sale (POS) system, which doesn’t account for people who leave without purchasing anything, don’t have enough information to calculate their actual conversion rate.
In an effort to harvest more accurate numbers, a small group of campus stores, including Titan Shops, California State University, Fullerton, and 49er Shops Inc., California State University, Long Beach, took part in a six-week NACS Retailing Council pilot program using devices from We Count People (https://wecountpeople.com), a reseller of patron, traffic, and people counters. The pilot launched in the stores in December 2015 and January 2016.
The CAMEX 2016 Flash Session “Analyze Your Conversion Rate, Increase Your Productivity” detailed how the counters revealed data useful not only for better determining conversion rate, but also for fine-tuning staff scheduling and displays and gauging the success of promotions.
“If you only count sales, you’ll never know the opportunities you’re missing,” said Kimberly Ball, director, Titan Shops.
Say 1,000 people come in but only 600 buy something; that leaves 400 people who departed the store empty-handed. Maybe they never intended to buy anything, but they were still potential shoppers. Adjusting displays, specials, promotions, or customer service might convert some of that flow-through into transactions.
A more accurate conversion rate can also help improve staff scheduling. Ball’s co-presenter, Cyndi Farrington, CCR, operations manager, 49er Shops Inc., said she had been telling her frontline cashier manager that she thought he was scheduling too many cashiers for the store’s opening hours.
“He would always respond back to me, ‘Well, we have a lot of customers in the morning.’ What is a lot? Your sales don’t show it,” she said.
When Farrington analyzed data from the counters, she found that the store was only doing 3% of its business in the first two hours of the day. “So there was no need for us to have four or five cashiers at 7:15 in the morning, when noon is really our busiest time,” she noted. “We can adjust that labor to move it toward our busiest time to really help take care of our customers when our customers need to be taken care of.”
The pilot, initially planned to launch early last fall, got a late start because it took longer than expected to secure funding. Farrington got her devices in place the week before classes started, so she’d collected only about a month’s worth of data before CAMEX.
Monday is the busiest day of the week at her store. Comparing the device counts for Monday, Feb. 15, to her POS data, Farrington found that between noon and 1 p.m. (with no classes scheduled then, it’s the busiest hour of the day) the store got 13.96% of its visitors for the day and 13.98% of its total sales for the day.
However, there isn’t always that close a correlation between traffic and sales. For instance, between 5-6 p.m., 49er Shops saw just 4% of its total customers for the day, yet made 10.42% of its sales.
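A sketch of the comparison Farrington describes: compute each hour’s share of daily traffic and of daily sales from hourly counts, then set them side by side. The only real figures here are the article’s percentages (13.96% vs. 13.98% at noon, 4% vs. 10.42% at 5 p.m.); the raw counts are invented to reproduce them, and the function is hypothetical.

```python
def hourly_shares(counts: dict[str, int]) -> dict[str, float]:
    """Each hour's count as a percentage of the day's total."""
    total = sum(counts.values())
    return {hour: 100.0 * n / total for hour, n in counts.items()}

# Invented counts chosen to match the article's percentages.
traffic = {"12-1 p.m.": 1396, "5-6 p.m.": 400, "rest of day": 8204}
sales   = {"12-1 p.m.": 1398, "5-6 p.m.": 1042, "rest of day": 7560}

t, s = hourly_shares(traffic), hourly_shares(sales)
for hour in traffic:
    print(f"{hour}: {t[hour]:.2f}% of traffic, {s[hour]:.2f}% of sales")
```

At noon, traffic share and sales share track almost exactly; at 5 p.m. they diverge, which is the kind of pattern the counters surface.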
We Count People software creates a weekly graph showing traffic by day and by hour, so trends become apparent. A larger store with multiple entrances might discover that one of those entries gets so little traffic that it could be closed off, while a busier entrance could be ripe for placement of more promotional materials.
Some sales or traffic data may need to be fine-tuned or filtered out for greater accuracy. For instance, Titan Shops conducts refunds outside the store during back-to-school, so while those transactions count in Ball’s POS data, they’re not actual customers walking through the store. At the same time, online order pickups are done inside the store, but those transactions aren’t reflected in POS data.
The battery-operated counters are about the size of two cellphones and have a maximum range of 40 ft., although ideally they should be mounted 20-30 ft. from each other. Ball mounted her units with 3M tape. A laser beam on one side counts people coming in, while a beam on the other side counts those going out.
Users download the data onto a flash drive and then load it into a spreadsheet to run reports. Ball said the software is “super-simple.”
Farrington agreed, saying, “The hardest thing about the whole process for me is that our IT department has to load all the software on our computers. That was the hardest part: waiting for IT.”
She handed off data entry from the units to a student worker. “We run reports out of WINprism,” she said. “It’s not so complicated that I have to do it, which is good because I don’t have any time.”
The devices can produce information on far more than just the comings and goings at entrances. Sited strategically, they could monitor how many people head into the tech department or any other discrete zone within the store.
Placed near the fixture for a sale item, they can gauge how soon and how much traffic bumps up after the sale is promoted on email or social media. They also offer opportunities to track how long people wait in line, how many people stop to engage with a window display, how many then come into the store, etc.
Both presenters said that even a small store could afford We Count People’s technology, with the devices ranging from $400-$700. “Cyndi already paid for the cost of the unit within a week or so by adjusting hours, maybe better service,” Ball said.
Even though the pilot period has ended, she and Farrington continue to collect data and will be preparing a case study over the summer for NACS to share with members. “It is probably best suited for a yearlong analysis, to have a complete picture of traffic fluctuations,” Ball said.