Modern campus network management is a journey. The devices we manage constantly evolve, and how they interact is fluid over time as new protocols, security mechanisms, and redundancy solutions emerge. The techniques we use to manage these dynamic systems must respond to, and even anticipate, this changing landscape. Hi, I'm Steve, and I'll be talking with you today about several key aspects of this network management journey, introducing you to some technologies and techniques to help you on your way. I will start with the network management challenges that drive new technologies and services. Then you'll explore classic NMS tools, because they're still out there and they remain viable solutions for many organizations. But modern NMS tools have been developed that reduce overhead, speed diagnosis, and improve situational awareness. We'll end with a brief discussion about Aruba's Edge Services Platform, or ESP; it's a compelling solution. But for now, let's home in on modern campus network management challenges with a focus on the topics you see here, starting with a discussion of cloud services.

As you may recall from the Modern Campus Network Design video series, the National Institute of Standards and Technology, or NIST, defines cloud computing as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. It's a virtualized infrastructure offered as a service to be consumed via the network. Back-end servers, storage, compute resources, and security are all key components of cloud infrastructure, consumed via some front-end platform: a PC, tablet, mobile phone, or a computer network.

Cloud migrations require careful planning and significant time, financial, and staff resources. Why the big push? It's all about saving money, improving resiliency, adding intelligence, and offering new and better services. There is financial flexibility: OPEX models and scalability eliminate wasted expenditures. There is speed of deployment: no need to assemble, test, and deploy complicated systems and solutions on your own. There is access to the latest functionality: service providers continuously evolve their technology, passing along improvements and innovations to customers. And there is staff and skill-set augmentation: customers can access external expertise to complement or augment their own IT staff.

Still, it's not without challenges, starting with security. Depending on the deployment and vendor, both the provider and the customer have security responsibilities, which should be written into a service level agreement, or SLA, or otherwise formalized. In general, cloud provider responsibilities are physical data center security; cloud platform security, especially as it relates to hardware, software, and the network; and the need to address security issues without interrupting service to the customer. Meanwhile, customers are generally responsible for the organization's security policy and ensuring those policies are met in an SLA with the provider; security testing of the infrastructure components; security of applications developed by the customer; and security of VMs, containers, and data.

Then there are compliance issues. You must meet legislation and standards for privacy and information processing. These compliance requirements are often government-mandated, especially for vertical markets like health care and finance.
Here I'm thinking about things like the HIPAA and PCI compliance regulations in the US, and the European General Data Protection Regulation, or GDPR, for storage of personal data. Your organization may have its own compliance guidelines, and you must ensure that service providers meet your requirements. A public cloud model might introduce security and compliance issues. For certain very large-scale deployments, operational costs can be high, but this is often offset by the low CAPEX, or upfront, costs. Finally, staff may require new skill sets and paradigms to work with cloud vendors and technologies.

These are a few challenges related to centralized cloud-based services, and they can be compounded as our user base becomes more decentralized and devices become more varied. How do you extend the campus experience out to these disparate users, at home or even in their local cafe? These users must have the same experience: the same security, connectivity, and manageability. Some remote offices might have a gateway with firewall and virtual private network, or VPN, capabilities. Home offices may have one or more access points, or APs, with a wireless LAN, or WLAN, for home use connected directly to the internet via a firewall, and a separate WLAN for business use, all securely tunneled back to the campus. Same user experience, same management visibility, as if they were on campus. The same holds true for employees working from a hotel, using secure VPN solutions like Aruba's VIA client. You have users connecting from various locations using various devices: PCs, tablets, and smartphones. Now, there are challenges here, to be sure. Your management solution should be able to identify these devices, along with the who, what, how, when, and where context information.

These endpoint issues become even more challenging with the proliferation of Internet of Things, or IoT, systems, which often consist of sensors, actuators, and controllers that help to integrate our digital, physical, and biological worlds. Sensors can detect nearly anything: water pressure, heat, smoke, the proximity of expensive equipment and valuable humans, whether somebody is waiting at a traffic light or a crosswalk. These systems can create a lot of data, and they often lack security features. As IoT systems proliferate, we must respond with the ability to efficiently handle, process, and store big data. We must keep these devices available, with good redundancy and tight security. Imagine a hacker accessing some unprotected IoT device and using it as a launchpad to attack the rest of your internal systems; not good. These IoT systems require a more responsive, tighter approach to security.

They are also increasing the need for real-time decision-making and problem resolution. Suppose that some sensor detects something like temperature, pressure, or maybe fluid level. Perhaps it reaches some maximum level indicating that an actuator should shut off a heat source, open a pressure-relief valve, or close a water valve. If all your intelligence is in the cloud, that signal must travel up through the access, aggregation, and other layers, up to the cloud for data analysis. A decision is made, and then signaling is sent back down to the actuator. With applications like this, we need answers now. We want intelligence at the edge; a simple sketch of that idea appears just below. This intelligence at the edge is a key idea behind Aruba's Edge Services Platform, or ESP, which I'll touch on soon, but first, I want to mention an issue with management platforms.
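Here is that sketch. It's a minimal, hypothetical Python example, not an Aruba or product-specific implementation; the sensor, actuator, and upload functions (read_pressure_kpa, open_relief_valve, send_telemetry) are placeholders I've invented for whatever your IoT platform actually provides. The point is simply that the safety-critical threshold check and the actuation happen locally at the edge, while every sample still flows up to the cloud for longer-term trend analysis.

import queue
import random
import threading
import time

PRESSURE_LIMIT_KPA = 800.0  # illustrative safety threshold

telemetry_queue: "queue.Queue[dict]" = queue.Queue()

def read_pressure_kpa() -> float:
    # Placeholder sensor driver; here it just simulates a reading.
    return random.uniform(700.0, 850.0)

def open_relief_valve() -> None:
    # Placeholder actuator driver.
    print("Relief valve opened locally")

def send_telemetry(sample: dict) -> None:
    # Placeholder upload to a cloud analytics service.
    print("Uploaded to cloud:", sample)

def cloud_uploader() -> None:
    # Background thread: cloud or WAN latency never delays the safety action.
    while True:
        send_telemetry(telemetry_queue.get())

def control_loop() -> None:
    while True:
        reading = read_pressure_kpa()
        if reading >= PRESSURE_LIMIT_KPA:
            # The decision is made at the edge, with no round trip
            # through the access, aggregation, and cloud layers.
            open_relief_valve()
        # The cloud still receives every sample for trend analysis.
        telemetry_queue.put({"ts": time.time(), "pressure_kpa": reading})
        time.sleep(1)

if __name__ == "__main__":
    threading.Thread(target=cloud_uploader, daemon=True).start()
    control_loop()

With that brief aside done, back to the issue with management platforms.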
I've seen quite a few management systems over the years, many of them quite good at what they do, but they only do so much. One system gives me great visibility into the wireless network, another helps manage switches and routers, and perhaps another system handles remote branch connectivity. It can be hard to be an expert in one of these systems, much less all three. These separate silos bite us in another way as we move outside the campus, to multiple campuses, remote branches, and on into nationwide deployments. We might have a set of management systems in the North, South, East, and West campuses across our country. With separate management systems, key opportunities are lost. But if we had a common data lake for the entire deployment, with AI and ML analyzing this data for trends, we could more easily answer questions like: Why does the West Coast campus have 12 percent more end-user complaints related to Wi-Fi systems? Why is Voice over IP latency higher in the Northern branch?

What about security? Look at what I'm asking of you so far. You have a classic campus network. Now, add some cloud services, throw in some IoT devices, and accommodate internet, branch office, SOHO, and remote teleworker connectivity. Each new requirement adds a potential weakness for exploitation, and bad actors devise more clever ways to attack us every day. Now, recall the days when security meant adding a firewall to your perimeter, configuring a few dozen or a hundred VLANs, and writing some access control lists, or ACLs, to lock things down. Then we began to realize that most attacks came from inside the network. With all these new potential attack vectors, we need security baked into the very fabric of our campus networks. Later in this series, I'll talk about some specific solutions to consider.

That wraps it up for the first video in this Modern Campus Network Management Techniques series. Please continue with the next video, where you'll explore classic network management systems, or NMS, tools.