The ability to create three-dimensional models is essential to the design and optimization of mobile networks, especially in dense urban environments where small cells are being deployed. It has never been easier to account for the influence of buildings and other man-made structures, both as an input to 3D traffic maps and for their impact on propagation. The 3D features of the Metro Network Design Package, part of InfoVista’s recently released Mentum Planet 5.6, help mobile operators overcome the urban network design challenge and produce accurate predictions and analyses that optimize the entire design process.
By Bernard Breton, Chief Marketing and Strategy Officer, InfoVista
As you’re likely aware, InfoVista acquired RAN and backhaul network planning and optimization provider Mentum last November, strengthening its presence within the mobile market. As a result of that acquisition, I have been ramping up as chief marketing and strategy officer, a new role at InfoVista and one that I am thrilled to take on! Part of this new role is to define a unified strategy for these two complementary companies and provide combined offerings that help operators better plan, operate, optimize and monetize their networks.
One area where I see great demand for these capabilities is in regard to the explosion of data demand across mobile networks. As I see it, mobile operators are stuck in a difficult position because they can’t generate enough revenue from this increase in demand to offset the cost it creates – particularly when the data demand is generated by OTT services.
Consequently, the delivery cost per bit must be reduced, and mobile operators are looking for the best method to cost-effectively and efficiently manage their networks’ capacity, quality and coverage. I don’t believe that there is a silver bullet for this problem. However, by improving the efficiency of all functions associated with the delivery of network capacity – from planning to optimization – as well as improving the utilization efficiency of the infrastructure, mobile operators can take steps to address this multi-faceted challenge.
This means optimizing and effectively managing every inch of a network through the implementation of best-in-class methods and tools. It also means taking further steps to address areas where demand exceeds bandwidth availability, such as implementing small cells and deploying new technologies like LTE. To maximize the return on these initiatives, operators will often put in place advanced service performance assurance systems across the all-IP mobile network, redefine their capacity planning approach and increase the level of automation, for example with the introduction of self-organizing networks (SON).
One week removed from Mobile World Congress, we can certainly say the event was a success for InfoVista. We received valuable feedback from attendees about the relevance of InfoVista’s acquisition of Mentum in light of growing interest in service performance assurance and in wireless network planning and optimization. Keep an eye out for more exciting announcements from us in 2013 as we continue to focus on delivering technological innovation that improves mobile network efficiency.
There was a noticeably high level of activity around small cells at this year’s Mobile World Congress, raising the question: what exactly are they? The Small Cell Forum describes them as “low-power wireless access points that operate in licensed spectrum, are operator-managed and feature edge-based intelligence.” It also clearly spells out the three types, from smallest (femtocells) to largest (metrocells and microcells).
One of the driving forces behind the creation of the LTE Release 10 specifications was the desire to fulfill the requirements set out by the ITU for IMT-Advanced in order to earn the 4G label. With increased peak data rates as the primary target, carrier aggregation is clearly the most significant improvement, as it allows for much greater theoretical data rates and lower latencies simply by leveraging all possible non-contiguous carriers allocated to a wireless operator. While reducing the need for contiguous spectrum, carrier aggregation also provides interesting side effects that are of great benefit in HetNet deployments. For example, with cross-carrier scheduling, the resources of secondary carriers can be scheduled on the PDCCH of the primary carrier, which reduces interference to the control channel. Similarly, there is no need to perform inter-frequency handovers between secondary carriers as long as the primary carrier provides coverage.

Along the same lines, advanced multiple antenna capabilities bring the spectral efficiency of LTE-Advanced well beyond the ITU requirements. On the downlink, 8×8 MIMO is achieved with the introduction of additional transmission modes that rely on the extensive use of UE-specific reference signals. On the uplink, a maximum of four spatial layers is now available to devices with four transmit antennas (rather large devices such as laptops and tablets), complemented by Multi-User MIMO (MU-MIMO), where two terminals can each transmit up to two spatial layers.
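To put rough numbers on the carrier aggregation gains, here is a back-of-the-envelope sketch in Python. All figures are illustrative assumptions: five hypothetical 20 MHz component carriers, 64-QAM, eight spatial layers, and no allowance for coding or control overhead, so real-world peak rates come out lower than this estimate.

```python
# Back-of-the-envelope LTE-A downlink peak rate with carrier aggregation.
# All figures are illustrative approximations, not a 3GPP-exact calculation.

def carrier_peak_mbps(bandwidth_mhz: float, layers: int, bits_per_symbol: int = 6) -> float:
    """Rough per-carrier peak rate, ignoring coding and control overhead."""
    rbs = int(bandwidth_mhz * 5)            # ~5 resource blocks per MHz (100 RBs in 20 MHz)
    symbols_per_ms = rbs * 12 * 14          # 12 subcarriers x 14 OFDM symbols per RB per ms
    bits_per_ms = symbols_per_ms * bits_per_symbol * layers
    return bits_per_ms / 1000.0             # kbit/ms is equivalent to Mbps

# Five aggregated 20 MHz component carriers (100 MHz total), 8x8 MIMO, 64-QAM.
carriers = [20.0] * 5
peak = sum(carrier_peak_mbps(bw, layers=8) for bw in carriers)
print(f"Theoretical peak: {peak:.0f} Mbps")  # ~4 Gbps before overhead
```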
When it comes to the management of a wireless network, there is a natural and logical tendency for engineers to focus on engineering matters, often at the expense of sound practices for network planning database management. Unfortunately, the cost of loose data management practices is not immediately noticeable, and it often takes years before an operator truly realizes the gravity of the situation. Often, it is a new initiative that reveals that the network planning database is inaccurate or inconsistent. In any case, the cost to an operator can be very substantial, ranging from a lack of efficiency to errors in the network database that can lead to expensive mistakes such as erroneous network build-outs.
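As a concrete illustration of what sound data management practices can automate, here is a minimal Python sketch of sanity checks over a site table. The record fields, thresholds and site names are invented for the example and are not drawn from any particular planning tool.

```python
# Minimal sketch of automated sanity checks for a network planning database.
# The record fields and thresholds here are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Site:
    site_id: str
    latitude: float
    longitude: float
    antenna_height_m: float

def validate(sites: list[Site]) -> list[str]:
    """Return a list of human-readable data-quality issues."""
    issues = []
    seen = set()
    for s in sites:
        if s.site_id in seen:
            issues.append(f"{s.site_id}: duplicate site ID")
        seen.add(s.site_id)
        if not (-90 <= s.latitude <= 90 and -180 <= s.longitude <= 180):
            issues.append(f"{s.site_id}: coordinates out of range")
        if not (0 < s.antenna_height_m < 200):  # flag implausible heights
            issues.append(f"{s.site_id}: suspicious antenna height {s.antenna_height_m} m")
    return issues

sites = [Site("NYC001", 40.71, -74.01, 30.0),
         Site("NYC001", 40.75, -73.99, 25.0),   # duplicate ID slips in
         Site("NYC002", 40.80, -273.90, 45.0)]  # bad longitude
for issue in validate(sites):
    print(issue)
```

Checks like these cost little to run on every database update, whereas the errors they catch can otherwise go unnoticed until a build-out is already underway.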
While the use of network-based measurements in the design and optimization of wireless communication networks is on the rise, propagation modeling remains a core practice for anyone involved in wireless communication. Propagation modeling, unlike measurements, can be used to evaluate new radio configurations, which is a critical element of the design process. Measurements, on the other hand, can only characterize current or existing configurations.
There is no doubt that the ability to model propagation conditions when evaluating the coverage of a candidate site or assessing the impact of a radio optimization is paramount across the entire lifecycle of the network. According to our most recent survey (November 2011) of how wireless planning tools are used, engineers in all geographies and markets are looking for further improvements in propagation model accuracy while at the same time wanting to minimize the amount of model calibration required to achieve such accuracy.
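To make the calibration step concrete, here is a hedged Python sketch that fits a simple log-distance path loss model, PL(d) = A + B·log10(d), to measurement samples by least squares. The data below is synthetic and the model deliberately simple; calibrating a production propagation model involves many more terms, but the principle is the same.

```python
# Illustrative calibration of a log-distance path loss model
#   PL(d) = A + B * log10(d)
# against measured samples, using ordinary least squares.
# The data below is synthetic; real calibration uses drive-test measurements.

import numpy as np

rng = np.random.default_rng(0)
d = rng.uniform(50, 2000, size=500)                    # distances in meters
true_pl = 32.0 + 35.0 * np.log10(d)                    # "ground truth" model
measured = true_pl + rng.normal(0, 6.0, size=d.size)   # ~6 dB shadowing noise

# Fit A and B by least squares: measured ~ A + B * log10(d)
X = np.column_stack([np.ones_like(d), np.log10(d)])
(A, B), *_ = np.linalg.lstsq(X, measured, rcond=None)

residual = measured - (A + B * np.log10(d))
print(f"Calibrated: A={A:.1f} dB, B={B:.1f} dB/decade, "
      f"std error={residual.std():.1f} dB")
```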
How do you define software architecture? Everyone has their own definition. There are many definitions out there, and most of them are very much software-oriented; an example is the one chosen by Wikipedia, which defines software architecture as “the set of structures needed to reason about the system, which comprise software elements, relations among them, and properties of both.” Microsoft’s definition is instead fairly high-level and abstracts this subjective notion very well: “Software application architecture is the process of defining a structured solution that meets all of the technical and operational requirements, while optimizing common quality attributes such as performance, security, and manageability. It involves a series of decisions based on a wide range of factors, and each of these decisions can have considerable impact on the quality, performance, maintainability, and overall success of the application.” Another interesting definition comes from Martin Fowler: “The highest-level breakdown of a system into its parts; the decisions that are hard to change; there are multiple architectures in a system; what is architecturally significant can change over a system’s lifetime; and, in the end, architecture boils down to whatever the important stuff is.”
Predicting radio channel behavior in an environment where wireless networks are deployed has always been an important but challenging task for engineers, because radio wave propagation models must balance two contradictory requirements: (1) predicting radio path loss with the highest possible accuracy and (2) doing so while remaining computationally efficient enough for operational use on large radio networks.
With the first 2G wireless network deployments in the 1990s, propagation modeling based on empirical models, with additional corrections based on analytical methods, became very popular. This was mainly because more advanced modeling techniques would have required more computing power than was available at that time, as well as a highly accurate, but often unaffordable, geographical database describing terrain and buildings. These models lacked accuracy because propagation phenomena were modeled in a simplistic manner. Furthermore, such models could not accommodate different propagation environments without recalibration, making them costly and very complex to deploy operationally.
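For illustration, here is a minimal Python implementation of one classic empirical model of that lineage, the COST-231 Hata formula for urban macrocells. The validity ranges in the comments are the model’s usual limits, and the example inputs are arbitrary; a calibrated operational model would add environment-specific correction terms on top of a formula like this one.

```python
# Minimal sketch of an empirical propagation model: the COST-231 Hata
# formula for urban macrocells (valid roughly for 1500-2000 MHz,
# base heights 30-200 m, mobile heights 1-10 m, distances 1-20 km).

import math

def cost231_hata(f_mhz: float, h_base_m: float, h_mobile_m: float,
                 d_km: float, metropolitan: bool = False) -> float:
    """Median path loss in dB."""
    # Mobile antenna height correction (medium city / suburban form).
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m \
           - (1.56 * math.log10(f_mhz) - 0.8)
    c = 3.0 if metropolitan else 0.0
    return (46.3 + 33.9 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km)
            + c)

# Example: 1800 MHz, 30 m base station, 1.5 m mobile, 2 km, dense urban.
print(f"{cost231_hata(1800, 30, 1.5, 2.0, metropolitan=True):.1f} dB")
```

Note how cheap a single evaluation is: a handful of logarithms per pixel is what made models of this family practical on 1990s hardware, at the price of the accuracy limitations described above.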