Surfing the Data Tsunami - solutions for mobile data offload
Date: Wed, 10/06/2010 - 18:40
At peak times, 3G networks can barely handle the traffic surge as users scramble for high-speed, data-guzzling mobile applications. Ironically, most of that traffic is accessing Internet content on other providers' servers, while the carriers are simply bearing the traffic burden, forcing them to invest ever greater amounts of money to ease bottlenecks in the infrastructure for no greater returns.
Mobile operators are looking for any way to maximize the amount of traffic their networks can support for the smallest investment. What more can be done to improve the infrastructure? The flexibility and low CAPEX of microwave backhaul is already making inroads, even in the USA where so much wireline is already in place. And how about Data Offload? It's at the heart of fixed/mobile convergence technologies - such as femtocells and dual-mode WiFi - which take voice and data traffic off the radio access network and transmit it to the core via broadband links.
In this session, introduced and chaired by Joshua Budd, Program Manager, Communications Group, IDC, at the NetEvents EMEA Press Summit, Istanbul (September 30th – October 1st 2010), we discuss the various bottleneck points in the network and the full panorama of solutions that mobile operators are leveraging to alleviate the problem.
Panellists: Craig Easley, Vice President of Marketing and Product Management, Accedian; Dean Bubley, Analyst & Founder, Disruptive Analysis; Amir Zoufonoun, CEO, Exalt Communications; Natasha Tamaskar, VP Product Marketing, Genband; Nigel Wright, VP Marketing - Wireless, Spirent Communications
For the event today, it's certainly brave to make predictions and come back and face the music, but also very refreshing. It certainly illustrates the challenge that I think the industry is facing; how to predict the future; how to understand how consumers and businesses will change their behaviour in the coming years.
It's been a particular challenge for the telecoms industry and we've seen a lot of innovation in the past few years. We were talking with some of my friends the other day about some of the changes that have happened in the past couple of years, and we talked about the iPhone in particular being a real game changer, and really changing the way people are using telecoms networks. And that leads into the topic of our discussion today, and that is dealing with the growing demand for data.
Predicting the future is always a hard thing, and there is a lot of talk about 3G and a lot of investment into 3G. And those networks are still being rolled out and we're already talking about 4G and advancements on 4G already. And there is already an investment going into that. Well, it's hard to make sense of all these changes and to make these decisions for the future, but I think our panel discussion today will help us try and make sense of what challenges the industry is facing.
Before we get to the panel I'd just like to make a quick introduction to put everything in context. I don't want to dwell too much on the introduction. I'd like to get to the panel so they have time to talk and you have time to ask questions. But we're talking about mobile data offload today, and this relates to the growing popularity of not only smartphone devices like the iPhone, but just the way people are using the Internet and how more and more people are using Internet over mobile networks.
And of course the operators were hoping for this. As the Voice market has matured they have had to seek new revenue sources, and mobile data is certainly one of those sources.
So they were hoping that their customers would adopt mobile data, but in the process it's brought on a whole set of new challenges that they have to deal with.
So as the Voice market has matured and operators were looking for new revenue sources they started to invest in new network technologies - 3G being one of them - but already the demand for data has been growing, and doubling and tripling over the years. And they've had to already invest in new technologies, like HSPA and HSPA+, which is boosting the capacity in line with customer demand but facilitating even more data use among the customers.
Now a lot of the data demand these days is coming from laptops using mobile dongles, but as smartphones, like the iPhone and other devices, become more popular, more and more data is coming from these devices. And these devices are becoming more complex as people use them for all kinds of applications beyond just basic Internet connectivity. And this is now leading operators to look into even newer technologies, such as LTE and LTE Advanced, to add even more capacity to the network. And this is in response, not only to the fact that people are buying more and more smartphones, but people are downloading more and more applications which require more data, more bandwidth.
Now how to manage all this? It's one thing to have the network, it's one thing to have the capacity, but to put it all together and maintain a good quality of service for the customer is incredibly important. And we saw in The States, with issues related to iPhones dropping calls, session delays and other problems, that this can really impact how customers use the network and which operators they choose to subscribe to. So managing data is a big issue that operators are now faced with, and we're going to talk today about the different options that operators can explore to handle all the data that is being transferred over their networks.
What I'd like to talk about today with the panel is some of the key challenges that the operators are faced with and how they can deal with all this. I've identified a few here, but I hope our panellists can expand on all this, and I'm sure they will. But there are a few key issues that I'd just like to kick off the conversation with and we'll see where it goes from there.
First of all, multiple sources of data. There are different types of traffic going over the networks these days - it's not just simply Internet traffic. There are more and more applications being downloaded to handsets and this is changing the type of data that's being transferred. It's not just Internet traffic: operators also want to invest in premium services - their own data traffic - which they have to manage as well and make work together with everything else.
On top of that the devices themselves are becoming very complex, and there is all kinds of signalling overhead and messaging, and devices are becoming what they call 'chatty', which is adding even more pressure on the networks. So we're going to talk about this issue. We're going to talk about the fact that operators are now running multiple networks in parallel. So we have 2G networks running alongside 3G networks, 4G networks running parallel to 3G networks. And how do you manage all of this and make sure that the customer doesn't notice any difference; that their data session just works?
Where in the network are the bottlenecks taking place? They're in the access, in the backhaul, in the core. There are bottlenecks taking place all over the network and we have to find out where they're taking place and how we solve them in those places.
And ultimately how do you choose the right strategy? Each operator is different.
Each operator has different objectives. And I think what we'll find today in the discussion is that there isn't really a one-size-fits-all solution to deal with what we call the data tsunami; that operators are going to have to look at a number of different solutions and maybe combine a number of different strategies together.
So this is just to get the discussion going. I'm going to skip biographies - we don't have too much time. So I'll just give a quick overview of our panellists today. We have Craig Easley, Vice President of Marketing and Product Management for Accedian Networks; and Natasha Tamaskar, Vice President of Product Marketing for Genband; Amir Zoufonoun, CEO of Exalt; Nigel Wright, Vice President of Marketing for Wireless, Spirent Communications; and Dean Bubley, Analyst and Founder of Disruptive Analysis.
Well, Dean, I think I'd like to start with you. You're often in the position of asking lots of questions to panellists at these events, so I guess we get to turn the tables on you now, and it's a good opportunity.
When we talk about mobile data offload it could mean different things to different people. On a basic level it could just mean adding boxes and devices to the network to help manage and offload some of the data. But you've been writing a lot about this on your blog and I thought maybe you could tell us what you mean when you talk about mobile data offload.
Well, offloading is something that's been around in terms of mobile data for about two to three years now. We're probably now on the second or third phase of offload. And I think it's very, very context-specific, in that it's offload from a device, whether it's a smartphone or a laptop or something else, onto, typically, a cheaper and simpler network, ideally paid for by someone other than the operator, but with the operator hopefully retaining some sort of control.
Typically, it involves switching a device onto a local network, whether it's WiFi or via femtocell, or, if that's not possible, keeping the macro-cellular radio traffic from going all the way back through the core network and instead sending it straight out to the Internet. Because 95%, 99% of mobile data is going to and from the Internet and it's expensive to put that through all the complex machinery in the cellular data environment - the SGSNs and GGSNs. And if you can just dump it straight to the Internet, potentially, that's easier.
But then you have a whole range of issues with both WiFi and femtocells. WiFi sounds like a great option. Every PC has got WiFi, pretty much every smartphone has got WiFi, but you then need to think about how to control the connection manager software on the device. And usually the operator doesn't have a huge amount of control over when and how it connects to WiFi, particularly if someone's gone and bought a 'vanilla' device rather than getting a subsidised one from their operator.
Just out of interest, how many people in this room use pre-pay, or is everyone here on a mobile contract? 70% of the planet uses pre-pay, so that's the normal way of getting mobile services. And you'll see this increasingly with mobile data as well, which means that pre-pay customers usually get their device from someone other than their operator. They go to a market stall or an electronics retailer and it's very difficult for the operator to pre-load software on that device which facilitates when and how it connects to WiFi.
Femtocells are not problem-free either. They sound great in principle, but they need fixed Broadband.
So if you've got fixed Broadband with a femtocell on it, that's great, but not everyone has fixed Broadband and not every place has fixed Broadband. There are a lot of different scenarios - there is no one size fits all when it comes to offload.
Ok, great. Well, how about some of the other issues? We're talking about femtocells and WiFi hotspots. What about operators employing different service plans to maybe help discourage customers from using too much data? We're seeing operators abandoning the all-you-can-eat packages. How does that fit into the whole strategy of dealing with and managing the data?
The behavioural change through pricing is one of the things you've seen. To be honest, in Europe it's been common since pretty much day one of mobile Broadband.
You have 3Gb per month, or 1Gb, or fair usage contracts. The US has been slightly different because it's tended to have an all-you-can-eat completely flat rate, but that's been unusual.
You now do have more granularity on pricing and the way that pricing is enforced to try to change people's behaviour. So you might be able to say you get this amount at this speed, and then if you go over a certain amount the speed drops down a bit, or it's cheaper at weekends or off-peak hours. There are ways of pushing people's demand patterns for their data. There might be a different data plan for an iPad, an iPhone, a Blackberry and a Huawei dongle, because they know that if you have an iPad you do lots of video, and if you have a Huawei dongle you do lots of long sessions of Web browsing but probably not huge amounts of signalling with it. And so you can configure your pricing in such a way as to push demand around as well. But it's a bit of a blunt tool.
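Dean's tiering examples can be sketched as a simple policy function. This is a minimal illustration of the idea, not any operator's actual plan; the caps, speeds and session figures below are all invented.

```python
# Illustrative speed-tier policy of the kind described above: full speed up
# to a monthly cap, then a throttled rate, with off-peak traffic not counted
# toward the cap. All numbers are invented for the sketch.

def metered_usage_gb(sessions):
    """Sum only the peak-hour traffic toward the monthly cap.

    `sessions` is a list of (gigabytes, off_peak) tuples.
    """
    return sum(gb for gb, off_peak in sessions if not off_peak)

def allowed_speed_kbps(used_gb, cap_gb=3.0, full_kbps=7200, throttled_kbps=256):
    """Full HSPA-class speed until the cap, then drop to a trickle."""
    return full_kbps if used_gb < cap_gb else throttled_kbps

sessions = [(1.2, False), (0.9, True), (2.1, False)]   # 0.9 GB off-peak is free
used = metered_usage_gb(sessions)                      # 3.3 GB counted
print(allowed_speed_kbps(used))                        # cap exceeded -> 256
```

The same shape of policy covers the per-device plans Dean mentions: you would simply parameterise the cap and speeds by device class.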
But would you agree that operators - and I think you've used this term before - need to take a more holistic approach to managing the data; that it's really not just adding boxes or something like that? It's a wider strategy of overall managing the data (multiple speakers).
I think absolutely. One of the problems we've seen with a lot of the offload is it has targeted one particular bottleneck and often it's pushed the problems somewhere else.
So you see this quite often within the mobile world where people try and put a box to compress video traffic. Or, for example, I had one vendor talk to me about, we can see when someone's doing YouTube over mobile and we know that a lot of people will download a five minute clip but then just close the window after 30 seconds, and so 5Mb has been wasted. So we've got a way to drip feed through and buffer the video in advance.
And I said, "well, doesn't that mean that the radio network then keeps the radio connection live for all five minutes, rather than just pushing all the 5Mb through and switching the radio off, and doesn't that have an impact on the battery?" And that vendor said, "oh, I hadn't thought about that". And the problem there is that Spectrum is more expensive than GGSNs. So you absolutely need to think about all of the various bits of the puzzle together. And we're only just starting to see that strategic thinking about traffic management now. It's been very much fire-fighting, tactical, fix that box there, this there, without actually joining the dots.
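A rough back-of-envelope calculation makes the vendor's oversight concrete. All power and throughput figures below are invented placeholders, not measurements; the point is only the order-of-magnitude gap between "burst and sleep" and "drip-feed".

```python
# Back-of-envelope comparison of the two delivery strategies Dean describes.
# All power and rate figures are assumptions for illustration, not data.

ACTIVE_MW = 800    # radio power in active state (assumed)
IDLE_MW   = 20     # radio power once demoted to idle (assumed)
CLIP_MB   = 5.0
CLIP_SECS = 300    # five-minute clip
LINK_MBPS = 4.0    # assumed HSPA downlink throughput

def energy_joules(active_s, total_s=CLIP_SECS):
    """Radio energy over the clip, given seconds spent in active state."""
    idle_s = max(total_s - active_s, 0)
    return (ACTIVE_MW * active_s + IDLE_MW * idle_s) / 1000.0

burst_active_s = CLIP_MB * 8 / LINK_MBPS     # 10 s to pull the whole clip
burst = energy_joules(burst_active_s)        # download fast, then sleep
drip  = energy_joules(CLIP_SECS)             # radio pinned active for 5 min
print(round(burst, 1), round(drip, 1))       # 13.8 J vs 240.0 J
```

Under these assumed numbers the drip-feed keeps the radio burning roughly seventeen times the energy, which is exactly the battery impact Dean raised.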
So we've talked about limiting data and we mentioned some of the techniques that operators are using to control behaviour. One issue is whether there is enough Spectrum, and you touched on that, but capacity in the backhaul is something I'd like to talk about - it's a big part of the puzzle. And one of the questions is, is there enough backhaul to handle all this demand for data, even if there is enough radio capacity? Craig, I'd like to ask you how the migration to LTE will affect investment in the backhaul.
Are the operators prepared for this and what are some of the challenges in the backhaul related to managing data?
So, to Dean's point, I think we're starting to get enough Spectrum to actually have adequate capacity between the radios and the handsets, whether that's over LTE, WiMAX, or even WiFi. The challenge really is, then, in the backhaul network, where you don't want to build and manage separate networks for Voice and data. It's very confusing and complicated to manage all that stuff. So ideally you take a very low cost, very high capacity bandwidth technology - carrier Ethernet seems to be the one of choice - and then put the intelligence in that network so that you can carry all the different data types.
You have the Voice that's all going Voice over IP or being carried as emulated circuits over the Ethernet. You've got the data that everybody's afraid of and you've got your video. But there's a fourth type of traffic that's necessary to run all these towers successfully, and that's the synchronisation data. And synchronisation data carried out to a cell site requires a very precise service level agreement so that all those towers can stay synched up so that the calls don't drop. So providing the intelligence on that raw capacity of Ethernet is really critical to the Etherisation of cell towers.
I think now that timing distribution has been standardised with IEEE 1588 you'll see more and more implementations of it working their way into the base stations, and so leveraging Ethernet to deliver that will enable Ethernet to have wide-scale adoption here in Europe as we've seen it in The States.
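For reference, the 1588 mechanism Craig mentions (IEEE 1588, the Precision Time Protocol) recovers a slave clock's offset from four timestamps exchanged with the master. The arithmetic below is the standard delay request-response calculation; the example timestamps are invented and a symmetric path is assumed.

```python
# The core arithmetic of IEEE 1588 (PTP), the timing standard mentioned above.
# t1: master sends Sync, t2: slave receives it,
# t3: slave sends Delay_Req, t4: master receives it.
# Assumes a symmetric path; any asymmetry shows up directly as offset error.

def ptp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay  = ((t2 - t1) + (t4 - t3)) / 2.0   # mean one-way path delay
    return offset, delay

# Invented example: slave clock runs 50 us fast over a 100 us path
# (all times in microseconds):
t1, t2 = 0.0, 150.0      # 100 us flight + 50 us slave offset
t3, t4 = 300.0, 350.0    # 100 us flight - 50 us offset seen at master
print(ptp_offset_and_delay(t1, t2, t3, t4))   # (50.0, 100.0)
```

It is precisely because this calculation needs tightly bounded, symmetric delay that the backhaul SLA Craig describes has to be so strict.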
So the backhaul is becoming more intelligent. It has to deal with more and more data so it has to become more and more intelligent.
It has to. It's not just being used for data offload anymore. It's carrying all different types of data - best-effort traffic as well as very precise timing and synchronisation information - so it has to be more intelligent.
Amir, I'd like to move on to you, actually. I think you could add some interesting points here.
Capacity is always a challenge in the backhaul, no matter what, no matter how intelligently you're managing the traffic. One of the big problems with capacity, particularly in EMEA, is just the cost of adding more and more capacity - it's a major challenge for the operators. You're very involved in microwave technology. What's going on in the microwave sector that could help operators deal with this problem?
Yes, quite a bit, actually. Microwave, in recent years, has seen a very, very accelerated rate of innovation, and that has caused the price and cost of microwave to come way down while the functionality goes up at the same rate. So it's following something very similar to Moore's Law. And what that has done is that now we're getting to the point where we can get 1Gb connectivity, 1Gb per second between cell towers, for very little money. The cost has gone way, way down.
And obviously there is elasticity in this market and that will cause the rate of adoption of these microwave devices to increase. Of course, in EMEA over 70% of cell sites are already using microwave, but we're talking about your grandfather's microwave, 4 E1, 8 E1 type microwave radios. Those radios have to be ripped out and replaced by these Ethernet-based radios that can carry up to 1Gb, or even more, capacity.
One more prediction we can add for the next five years is that microwave technology will get to the point where you can get gigabits' worth of capacity for less than $1,000. When it gets to that point, then why not add more and more microwave to the network along the way, in the entire chain of backhaul going to the core, to offload traffic, to create loops within the network and bring the cost per bit way, way down?
Do you think it is becoming financially more feasible to really address the growing demand for data in the backhaul for the operators?
Absolutely. The technologies are there today, but with operators, as everybody knows, it takes time. Adoption is not that fast in that market.
Amir, are you not, though, indirectly competing with fibre?
In a way we are, but if you think about it, the capacity in the fibre network is also going up. So fibre keeps going from 10Gb, to 40Gb, to 100Gb, while in the outer parts of the network, off the backbone, effectively, what you see is, first of all, more nodes that need higher capacity. Because of the move towards LTE we will have to split these cells due to the laws of physics. As that happens, it just doesn't make sense economically to dig up the street and run fibre to every node. So fibre and microwave will work in unison and become friends as opposed to enemies.
But if you take the offload and the small-cell theory to its conclusion, and we end up with lots of other femtocells or other small cells, you're not going to put a microwave on a femto.
When the cost comes to the point that it's basically a credit card purchase, I do see that some day we will see microwave in a box, basically. You have a hop in a box and you can purchase that and deploy it anywhere very simply, just like you do APs today for WiFi. There is no reason why that cannot happen.
You brought up an important area there - femtocells. Natasha, I think maybe this would be a good point for you to jump in.
It seems like we're addressing the issues of capacity in the backhaul and this should enable a smoother rollout of LTE. When people talk about LTE they often talk about rolling out femtocells with LTE deployments. How do you see this playing out in the future - the role of LTE, and femtocells helping to offload the data?
Femtocells are being used to offload data today, and that's primarily for 3G-type networks. But what we have seen is that, as people are looking to deploy LTE, the frequency range that is used for LTE in some cases is such that in-building penetration is still a huge issue. And that, combined with the fact that 40% or more of the data traffic is used in the home, creates a bigger problem. And so we are seeing that a lot of the service providers that are looking to roll out LTE are starting to use femtos as a solution that works in conjunction with their total LTE rollouts as well.
Typically, in terms of femtocells, the initial problem statement was all around coverage, but today that is starting to move towards something that can be used for offload, for some interesting reasons. Typically you can offload traffic globally: you can say, ok, I see there is a lot of traffic, so let me just offload 30% of the traffic from a certain region out of this macro-network. Or you can be more selective.
As an example, only two days ago we had Vodafone here as a service provider and they talked about a specific region in the African Continent. And in one of the countries they have an issue where they had a region with about 30,000 or so subscribers. And out of the 30,000 subscribers they had a huge congestion problem.
And as they narrowed down and found out what was happening, they found out that this congestion was actually related to five subscribers. Five subscribers were using their homes actually as virtual data centres or whatever you may call that.
So if you could isolate these types of subscribers by using technology such as traffic management or DPI, if you're not shy of using that name, you can use them to offload certain traffic using femtocells regionally. So you don't have to say I have to put WiFi every single place in the network. You can say I already know who these 10 subscribers are, I will offload this using femtos.
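The targeting Natasha describes starts with something very simple: ranking subscribers by traffic and picking the handful who dominate a cell. A minimal sketch, with invented subscriber IDs and volumes:

```python
# Rank subscribers by total traffic to find the "five subscribers" case
# described above. The records are invented illustration data; in practice
# they would come from traffic management / DPI systems.
from collections import Counter

def heaviest_users(usage_records, top_n=5):
    """usage_records: iterable of (subscriber_id, megabytes) tuples."""
    totals = Counter()
    for sub, mb in usage_records:
        totals[sub] += mb
    return totals.most_common(top_n)

records = [("sub1", 9000), ("sub2", 120), ("sub1", 8000),
           ("sub3", 95), ("sub4", 7000), ("sub2", 60)]
print(heaviest_users(records, top_n=2))   # [('sub1', 17000), ('sub4', 7000)]
```

Once the heavy users are identified, the operator can offer exactly those subscribers a femto or a revised plan instead of blanketing the whole region.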
So a more targeted deployment rather than just a blanket rollout?
Craig, did you have something?
Yes. That's absolutely the way that it needs to go. You need to find those super-users and make sure that, first of all, you know where they are and what they can do to your network and, second of all, to put yourself in a position to charge for some of that bandwidth capacity that they're using.
Even in The States you're starting to see a more pay-as-you-go model. I bought my iPad on the last day that AT&T was offering their unlimited data plan. So I put it off as long as I possibly could, but I knew that 2Gb a month isn't sufficient. And just now, being over here for about a week and completely exhausting the 100Mb a month that I purchased before I came over, underscores the fact that people are going to be making money on the data that's coming into the network. So operators are going to be incented (sic) to build higher capacity, higher performance networks to support that, so that they can capitalise on that opportunity.
But I think what's happening is you're seeing all these little things pop up that maybe the operators weren't anticipating; the example you gave in Africa of the five users; all the different data plans. It seems to me that it's adding a great deal of complexity to the market as well. One service, all you can eat, seems kind of simple, but it's not manageable, so it's adding a great deal of complexity.
I talked earlier about all the multiple networks that are running. We talked about femtocells being added on top, WiFi. It seems complicated. Nigel, I think this is a good spot for you to maybe jump in. I'd like to ask you, with all this complexity being added into the network, how can operators assure their customers of a good, reliable quality of service?
I think there are multiple issues here. The role of the device itself also shouldn't be under-estimated. You were talking about some of the chattiness of these devices at the beginning of the presentation here. We've been developing some test methodologies for the impact of services on things like device power management.
And I'm not intending to pick on Google here, it just happened to be the first device up.
But we looked at an Android device, for example, plain vanilla, straight out of the box, and just ran it to see how often it tried to access the network over the course of an hour. It made 572 attempts to access the network, and this is a plain vanilla device straight out of the box with just the embedded apps in the OS, the OS itself, and whatever else the operator put on it. We hadn't spent the morning in the app store. This gives you a feel for the baseline scope of the challenge that operators have to deal with.
I think one of the good things is that some of the scalability of networks going forward is actually already in place. We've just done, I think, the biggest network infrastructure test in history with EANTC, in a lab in Berlin, actually looking at some of these next generation networks from Cisco and the number of all-IP subscribers that they might be expected to support - for example, the aggregation layer over some of these big routers. We had 750,000 subscribers going through an aggregation router driving IP traffic. We had more than 1m subscribers attached to a GGSN and attach rates of 18,000 subscribers a second.
So, as for the ability of these networks to scale going forward - everyone's very concerned about the risk and whether we will actually get there - I can tell you some of that's in place today.
This, obviously, was a best case. This was a single vendor, it didn't take into account all the inter-operability issues and so on. But I think at least a part of that capacity and scalability, the technology is there today, so I think some of the risks going forward are, perhaps, mitigated.
I'd like to follow that up. I'd like to ask you, and then open up to the panel, you talked about these chatty devices; do you think maybe the industry needs to co-ordinate a bit more and be more responsible with the way they use the Spectrum? Do you think there's enough being done in that area?
I think there is definitely an opportunity for improvement. At the moment, particularly with an open OS like Google's Android, people can embed whatever they want in it, so this baseline chattiness is not really controlled by anybody. And in environments like Dean was talking about earlier, where you can buy your device from anywhere, the operators aren't controlling what's on there. It's a really tough situation.
I think in terms of chattiness it's worth noting there are two or three different sorts of it as well. There's the application, say, like a presence client pinging back to the server, but you've also got the actual radio part of the phones behaving differently.
And the problem here is that some of the standards are not implemented by every vendor or every operator, or they're implemented inconsistently.
A couple of smartphone manufacturers are known to do some fairly aggressive things in their radio part of the chip which helps them control the battery life on the phone, but it causes problems in the network, because certain of the networks haven't implemented particular features in terms of power management and power levels.
Essentially, particularly with HSPA, you've got what's called 'active state', which takes a lot of power, and then the phone drops down to an idle state and below, with a couple of intermediate states in between. And if you can arrange it so that you take the phone's radio out of active state quickly, you save battery on the device. The problem at the moment is that some of the phones expect that capability to be in the network, and it isn't.
So instead they do almost like an emergency thing, where they tell the network to force them down to a lower power state, which it can do. It saves the battery on the phone, but the network then loses track of where the handset is and has to poll all the base stations and you end up with a signalling stall.
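The trade-off Dean describes can be caricatured with a toy state model. The state names roughly follow HSPA's RRC states (CELL_DCH as 'active', CELL_PCH as a low-power state the network tracks, plus idle), but the message counts are invented to show the shape of the problem, not measured values.

```python
# Toy model of the radio-state behaviour described above. Forcing the phone
# all the way down to idle saves battery, but every promotion back to active
# then needs a full connection set-up plus paging, flooding the signalling
# plane. A network-controlled demotion to a tracked low-power state (PCH)
# keeps promotions cheap. Message counts are invented for illustration.

SIGNALLING_COST = {
    ("idle", "active"): 30,   # full RRC connection set-up + paging
    ("pch", "active"): 8,     # network kept the context: cheap promotion
    ("active", "idle"): 5,    # device-forced "fast dormancy" release
    ("active", "pch"): 2,     # network-controlled demotion
}

def signalling_messages(bursts, rest_state):
    """Messages for `bursts` data bursts, resting in `rest_state` between them."""
    total = 0
    for _ in range(bursts):
        total += SIGNALLING_COST[(rest_state, "active")]
        total += SIGNALLING_COST[("active", rest_state)]
    return total

# Ten chatty bursts an hour: forcing idle saves battery but floods signalling.
print(signalling_messages(10, "idle"))   # 350
print(signalling_messages(10, "pch"))    # 100
```

The ratio, not the absolute numbers, is the point: a chatty device that forces itself to idle can multiply the signalling load several-fold, which is the "signalling storm" effect Dean describes.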
And, actually, only recently has some of that been recognised by the operators. We've been doing a lot of work with AT&T in the US on this wonderfully-termed 'device aggression management', which is exactly what Dean's talking about. So if it doesn't find the server, for example, what does it do? Does it keep on hammering the network with requests? Is there a timer in there? And, again, there are no real standards at the moment for how that's being dealt with.
Verizon has had that as a big KPI for their network for a long, long time, but only recently have other operators started to realise there's a huge envelope of variation in how these devices perform and how aggressive they are in their interaction with the network.
I'd like to keep on going, but I think maybe I could open up some questions to the audience. Who wants to go first?
Luke Collins from Engineering and Technology in the UK. I'm wondering if anybody knows any strategies to get apps developers and operators and network equipment providers more co-ordinated, so the developers could be encouraged to write 'gentle applications', if you want to use that term, which don't trash the networks in terms of network signalling and so on.
Yes, there certainly are some operators. I was talking to the CTO of a major operator last week and they are definitely working with developers to create network-friendly applications, both in terms of volumes and also in terms of this signalling. So I think you're starting to see that, but it's going to be quite a slow process because it's also going to involve the OS vendors and the device vendors in that same process.
But I think creating network-friendly applications is good for the app developer because the app will run faster and more reliably; it's good for the end user because it'll cost them less and, hopefully, the battery will last longer as well; and it's good for the operator. So it's a bit of a win, win, win - more so than an attempt by operators to try and charge application developers for deficiencies in their network, which I don't think is going to fly.
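One concrete example of 'network-friendly' behaviour, picking up the earlier question about retry timers: back off exponentially, with jitter, when a server is unreachable, rather than hammering the network. The schedule below is a common pattern; the timing parameters are illustrative.

```python
# Exponential backoff with "full jitter": each failed attempt doubles the
# retry ceiling (capped), and the actual wait is drawn uniformly below it so
# that many devices recovering at once don't all retry in lock-step.
# Parameters are illustrative, not from any standard.
import random

def retry_delays(attempts, base=2.0, cap=300.0, seed=None):
    """Return a backoff schedule, in seconds, for `attempts` failed tries."""
    rng = random.Random(seed)   # seedable for reproducible tests
    delays = []
    for attempt in range(attempts):
        ceiling = min(cap, base * (2 ** attempt))
        delays.append(rng.uniform(0, ceiling))
    return delays

print(retry_delays(6, seed=1))
```

An app that follows a schedule like this makes a handful of requests over several minutes where a naive retry loop might make hundreds, which is exactly the 'gentle application' behaviour the question asked about.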
Have they tried that?
They've been talking about it, whether individually or trying to put a tax on Google or whoever. It's not going to work, though. It's basically saying, we're trying to avoid being dumb, so can you tax the clever people for us. It's not really a winning strategy.
I think there's a very, very wide variation in the willingness of operators to push back on those vendors, though. North America I think is probably famous for the demands of the operators because they control, completely, the supply chain. But here in EMEA I'm constantly surprised at how unwilling operators are to actually push back on those vendors when they have badly behaved applications and so on.
I think it's also a learning curve. I was at an app development conference a couple of weeks ago in San Francisco and there were a couple of workshops, sponsored by AT&T and Verizon, on how to write network-aware applications for the developer communities. On the enterprise side, getting some of these big Oracle and SAP apps network aware and network friendly took a decade.
And I think a lot of people learned and remembered the pain that we all went through on the enterprise side and, so, getting out in front of that with education and awareness and convincing the app developers that if your app is network aware and network friendly it's going to behave better; it's going to lead to a better quality of experience from the person that's actually using it.
One of the biggest culprits on this is actually 3GPP, because for years they've been insisting that applications should be network agnostic. And they've created things like IMS with the idea that applications can run on any network the same way and don't have to be aware of the network, whereas, in fact, they do need to be aware of the network, because not all networks are created equal.
Good point. Anyone else?
[Hans Neeman], freelance. About operators controlling handsets: there are two types of handsets - those fully configured and supplied by the operator and, of course, the open-market versions. But if you look at Apple, they are already [smart]. They have a different approach. In fact, once an operator has a contract with Apple, they force all their handsets, independent of where they're coming from, to follow the same guidelines. Would that be a solution to the problem? So instead of giving the open handsets all the freedom to configure themselves and use the settings the customer likes, you'd now be forcing the handset vendors to supply one preferred configuration to the handset.
That does sort of solve the problem from an operator's perspective. But it's kind of heavy handed on the user community, I would say, and there's a big backlash against Big Brother making these decisions for us.
I think it's going to depend on the business model. Where a device is subsidised by an operator, the operator has the moral authority to tell the user how to use it. If it's something like a PC with a dongle, forget it. This is why the operators have been trying to sell PCs with embedded modems and no-one wants them. Because a PC is something you purchase as a user or a company and you set it up the way you want.
You don't let Orange or Vodafone or Verizon tell you how your PC works.
I think smartphones, Apple and also Blackberry are slightly unusual cases, because they've tied the device very much to, in Apple's case, to the iTunes and app store and, in Blackberry, to their own Clouds and Blackberry Enterprise and Internet services.
So it's a slightly different situation there. But particularly for Android, given how it's obviously open, there's going to be an open market in cheap smartphones. I think Google wants people to be able to buy a $70 or $100 Android handset in an electronics retailer or in a market stall in Sri Lanka, because that's the way to get around operator control. So it's going to depend on the circumstance.
Unfortunately, I do have to cut it off here. We have to move along. So thank you very much for all your time and comments and thank you for your questions.