Thursday, 13 October 2016

Is iBPMS the Secret Ingredient for Agile Digital Transformation?

Enterprises that are in the throes of digital transformation are quickly beginning to realize that agility has become a key factor for success. Nowhere is this truer than in the arena of DevOps, where digital transformation often means recoding applications, incorporating the cloud, leveraging APIs, and supporting the burgeoning mobile and remote markets.

Perhaps the secret to solving those challenges comes in the form of low-code or no-code application development tools, something that Appian Founder, Chairman & CEO Matt Calkins fully believes. Calkins said “Developers are embracing Low-Code Platforms to collaborate with business owners to deliver sustainable apps faster.”

Naturally, Calkins has a vested interest in the low-code ideology; after all, Appian is a Business Process Management (BPM) platform provider seeking to upset the apple cart with Appian Quick Apps, a product that transforms how applications are built, deployed and modified. Calkins added, “The goal with Appian Quick Apps is to keep users from getting bogged in specifying granular actions when building the core app while maintaining customizability.”

Calkins isn’t alone in his observations. Forrester and Gartner, both well-known research houses, agree with his assessment, offering that BPM tied to agility is the key to digital transformation success. However, there is a lot more to the story of low-code and no-code solutions than drag-and-drop simplicity. It all comes down to the creativity that such a development environment can deliver.

Calkins added, “Low-Code Platforms free up software developers from mundane tasks like adding fields to forms or testing mobile apps so they can do more of what they love: creating and inventing.” Those are capabilities that bring experimentation and agility to the forefront of BPM. Interestingly, BPM has become a bit of a misnomer in this arena, as low-code platforms can do much more than just automate business processes, something that adopters are starting to realize.

A case in point is the operational advantage that major wireless carrier Sprint-Nextel discovered by entrusting its digital transformation to low-code ideologies. Sprint-Nextel was able to quickly rework how it deploys and manages its growing wireless network. The company turned to a strategy of deploying some 70,000 “mini-macros,” small amplifiers that boost coverage, which proved to be a complex legal, regulatory, technological and overall program-management endeavor.

In the past, provisioning something along the lines of a new cell tower could take up to 30 days, simply because of inefficiencies found in process and data management. Sprint switched to a BPM solution, more specifically an iBPMS (intelligent business process management suite) to improve and accelerate the 8,000 business processes involved in provisioning a new cell site. Thanks to a low-code, no-code ideology built into the iBPMS, Sprint was able to reduce data collection, analysis and reporting times from 30 days to 7 days, with a solution that they were able to deploy in just 3 weeks.

Sprint isn’t alone in the race to innovation; countless other organizations are turning to iBPMS to realize the agility needed today for successful digital transformation. One such organization is Sanofi, the 4th-largest pharmaceutical company in the world. Sanofi’s goal was to achieve faster clinical trial start-ups, a key competitive advantage for getting new drugs to market. Sanofi approached digital transformation by looking at ways to automate the numerous steps involved in getting a clinical trial started.

Adopting an iBPMS solution that leveraged a low-code approach enabled Sanofi to reduce clinical trial startup times from six months to just two months. What’s more, Sanofi was able to reduce its exposure to risk by making sure every procedural step met the strict filing deadlines for compliance with regional governing bodies.

While Sprint and Sanofi prove the value of iBPMS with low-code methodologies, the applicability of iBPMS is much broader. Most any organization can benefit from the agility provided by low-code solutions, agility that is magnified when coupled with BPM. It is that realization that makes iBPMS the secret sauce for digital transformation, simply by lifting the constraints of high-code environments and manual tasks and transforming process creation into something akin to plug-and-play simplicity. After all, organizations are better served by their developers focusing on logic and not the particular nuances of a complex development environment.

 


Friday, 07 October 2016

The Future of Immersive Environments: Virtual Home Design, “Backcasting” the Future and a Look at How VR/AR Get Social

At the Gigaom Change conference in Austin, Texas, on September 21-23, 2016, Dr. Jacquelyn Ford Morie (CEO of All These Worlds), Melissa Morman (Experience Officer at BDX), Liam Quinn (CTO of Dell), and Doreen Lorenzo (Director of UT Austin’s Center for Integrated Design) talked about empathetic design in virtual space and the future of augmented reality.

The future is already here, but there is much more to come in terms of more fully immersive environments. Virtual and augmented reality (VR/AR) will proliferate in digital spaces, taking us from a two-dimensional interface to three-dimensional virtual spaces. But once these virtual and augmented environments are ubiquitous, what will we do, how will we react and what new things will we learn?

One of the areas where we’ll see some of the biggest changes is the home.

Melissa Morman, Client Experience Officer at BDX, is looking at ways homebuilders can adopt and deliver more digital experiences for their customers. Morman said she is scouting new technologies for the homebuilding industry by asking questions like, “How do you attract customers digitally?”

Currently, prospective homeowners are given floorplans to help them evaluate (and visualize) a new home. But when the home isn’t built or significant changes are being made, floor plans can’t do the job. Smart builders understand this and are looking at ways of using virtual and augmented reality tools to help clients see the possibilities.

Donning an Oculus Rift headset, customers are digitally immersed into the virtual home and are able to make adjustments to colors, materials and even the physical configurations of the rooms. Need to make a hallway wider for wheelchair access? Want to see what your countertop looks like with another color of granite? All of these changes can be visualized in great detail.

Once inside these immersive environments, how might we react though? What will our emotional responses be and how can those be used in creative ways?

Dr. Jacquelyn Ford Morie said that “VR lets you experience walking a thousand miles in someone else’s shoes. It’s powerful as a tool for empathy.” She cited a project called “Hunger in LA,” which places the participant in a reconstructed scene in which a real-life man collapses in line at a food bank. This project was ground-breaking as a journalistic approach to creating empathy and understanding.

The panel moderator and director of the UT Austin Center for Integrated Design, Doreen Lorenzo, agreed that there is a huge opportunity for designers to use VR and AR to “step inside” the world of the user and really understand what they need — whether you’re designing for someone with disabilities or understanding the specific needs of a group. Morie agreed, saying, “We’re starting to use a lot of VR for health reasons so it can be life-changing. That’s coming.”

But this is all a single-person experience. The perception of VR is that it’s anti-social. Can we expect to see social, virtual experiences?

Morie mentioned a project called Placeholder as a great example of some of the earliest social VR work ever done (the project was led by Computers as Theatre author and researcher Brenda Laurel). Filling the roles of different spirit animals, you and a group of your friends can talk to one another and leave each other messages in the larger scope of the game. There are also opportunities to have richer, more immersive experiences, such as diving under the water as a fish or soaring in the clouds as a bird. “VR is social, not anti-social,” she said.

If VR is about temporary immersive experiences, then AR is always with us. We can imagine it as constantly accessible informational overlays: picture a mechanic working on a part with a virtual manual right in front of them. Further in the future, though, AR has the potential to go beyond simple overlays. As AR and VR merge, they will create a mixed reality (MR) that is seamless and fluid.


Quinn said Dell is already starting to see aspects of this vision with its Smart Desk for creative professionals. Dell is developing business applications for augmented reality that will allow IT departments to do things like remote technical support with augmented overlays. The company is also working with automotive and airline partners to create mixed reality environments for their customers, creating ever-richer ways for them to engage.

By Royal Frasier, Gryphon Agency for Gigaom Change 2016


Thursday, 06 October 2016

Enchanting Products and Spaces by Rethinking the Human-Machine Interface

At the Gigaom Change conference in Austin, Texas, on September 21-23, 2016, David Rose (CEO of Ditto Labs, MIT Media Lab Researcher and author of Enchanted Objects), Mark Rolston (Founder and Chief Creative Officer at argodesign) and Rohit Prasad (Vice President and Head Scientist, Alexa Machine Learning) spoke with moderator, Leanne Seeto, about “enchanted” products, the power of voice-enabled interactions and the evolution of our digital selves.

There’s so much real estate around us for creating engaging interfaces. We don’t need to be confined to devices. Or at least that is the belief of Gigaom Change panelists David Rose, Rohit Prasad and Mark Rolston, who talked about the ideas and work being explored today that will change the future of human-machine interfaces, creating more enchanted objects in our lives.

With the emergence of Internet of Things (IoT) and advances in voice recognition, touch and gesture-based computing, we are going to see new types of interfaces that look less like futuristic robots and more like the things we interact with daily.

Today we’re seeing this happen the most in our homes, now dubbed the “smart home.” Window drapes that automatically close to give us privacy when we need it are just one example of how our homes and even our workspaces will soon come alive with what Rose and Rolston think of as Smart-Dumb Things (SDT). Another example might be an umbrella that can accurately tell you if or when it’s going to rain. In the near future, devices will emerge out of our phones and onto our walls, furniture and products. We may even see these devices added to our bodies. This supports the new thinking that devices and our interactions with them can be a simpler, more seamless and natural experience.

Rose gave an example from a collaboration he did with the architecture firm Gensler for the offices of Salesforce. He calls it a “conversational balance table.” It’s a device that helps subtly notify people who are speaking too much during meetings. “Both introverts and extraverts have good ideas. What typically happens, though, is that during the course of a meeting, extraverts take over the conversation, often not knowingly,” Rose explains, “so we designed a table with a microphone array around the edge that identifies who is speaking. There’s a constellation of LEDs embedded underneath the veneer so as people speak, LEDs illuminate in front of where you are. Over the course of 10 or 15 minutes you can see graphically who is dominating the conversation.”
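The mechanics Rose describes are easy to sketch in code. What follows is a minimal, hypothetical illustration (not the actual Gensler/Salesforce implementation) of how per-seat talk time reported by a microphone array could be mapped to LED brightness around the table edge; the diarization step that decides who is speaking is assumed to exist upstream.

  # Hypothetical sketch: accumulate per-seat talk time and map it to LED brightness.
  from collections import defaultdict

  talk_seconds = defaultdict(float)   # seconds of speech accumulated per seat

  def on_speech_detected(seat_id, duration_s):
      """Called by the (assumed) microphone-array diarizer for each speech segment."""
      talk_seconds[seat_id] += duration_s

  def led_levels(max_brightness=255):
      """Return a brightness level per seat, proportional to share of total talk time."""
      total = sum(talk_seconds.values())
      if total == 0:
          return {seat: 0 for seat in talk_seconds}
      return {seat: int(max_brightness * secs / total)
              for seat, secs in talk_seconds.items()}

  # Example: over a 15-minute meeting, seat 3 dominates the conversation.
  on_speech_detected(3, 540)   # 9 minutes
  on_speech_detected(1, 240)   # 4 minutes
  on_speech_detected(5, 120)   # 2 minutes
  print(led_levels())          # seat 3 lights up far more brightly than the others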

So what about voice? Will we be able to talk to these devices too? VP and Head Scientist behind Amazon Alexa, Rohit Prasad, is working on vastly improving voice interactions with devices. Prasad believes voice will be the key feature in the IoT revolution that is happening today. Voice will allow us to access these new devices within our homes and offices more efficiently. As advances in speech recognition continue, voice technology will become more accurate and able to quickly understand our meaning and context.

Amazon is hoping to spur even faster advances in voice from the developer community through Alexa Skills Kit (ASK) and Alexa Voice Service (AVS), which allow developers to build voice-enabled products and devices using the same voice service that powers Alexa. All of this raises important questions. How far does this go? When does voice endow an object with the attributes of personhood? That is, when does an object become an “enchanted” object?
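For context on what building against ASK looks like in practice, here is a minimal, hypothetical AWS Lambda handler that answers every request with a fixed utterance using the documented Alexa response JSON structure. A production skill built with the official SDKs would add intent routing, session handling and error checking; the behaviour shown here is invented purely for illustration.

  # Minimal hypothetical handler for an Alexa custom skill (Python on AWS Lambda).
  def lambda_handler(event, context):
      request_type = event.get("request", {}).get("type", "")

      if request_type == "LaunchRequest":
          text = "Welcome. Ask me to turn on the lights."
      else:
          text = "Okay, turning on the lights."   # placeholder action for any intent

      # Standard Alexa Skills Kit response envelope: version, outputSpeech, end session.
      return {
          "version": "1.0",
          "response": {
              "outputSpeech": {"type": "PlainText", "text": text},
              "shouldEndSession": True,
          },
      }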

At some point, as Mark Rolston of argodesign has observed, users are changed in the process of interacting with these objects and spaces. Rolston believes that our digital selves will evolve into entities of their own — what he calls our “meta me,” a combination of both the real and the digital you. In the future Rolston sees our individual meta me’s as being more than just data, but actually negotiating, transacting, organizing, and speaking on our behalf.

And while this is an interesting new concept for our personal identity, what is most interesting is using all of this information and knowledge to get decision support on who we are and what we want. The ability for these cognitive, connected applications to help us make decisions in our life is huge. What we’re moving toward is creating always-there digital companions to help with our everyday needs. Imagine the future when AI starts to act as you, making the same decisions you would make.

As this future unfolds, we’re going to begin to act more like nodes in a network than simply users. We’ll have our own role in asking questions of the devices and objects around us, telling them to shut off, turn on, or help us with tasks; gesturing or touching them to initiate some new action. We’ll still call upon our smartphones and personal computers, but we won’t be as tethered to them as our primary interfaces.

We’ll begin to call on these enchanted devices, using them for specific tasks or even in concert together. When you ask Amazon’s Echo a simple question like “what’s for lunch?” you won’t be read a lengthy menu from your favorite restaurant. Instead, your phone will vibrate letting you know it has the menu pulled up for you to scroll through and decide what to eat. Like the talking candlestick and teapot in Beauty and The Beast, IoT is going to awaken a new group of smart, interconnected devices that will forever change how we interact with our world.

By Royal Frasier, Gryphon Agency for Gigaom Change 2016


Wednesday, 05 October 2016

Artificial Intelligence: It’s Not Man vs. Machine. It’s Man And Machine

At the Gigaom Change conference in Austin, Texas, on September 21-23, 2016, Manoj Saxena (Chairman of CognitiveScale), Josh Sutton (Head of Data & Artificial Intelligence at Publicis Sapient) and Rob High (CTO for IBM Watson) talked with moderator and market strategist, Patricia Baumhart, about the next frontier in artificial intelligence and how the race to win in AI will soon reshape our world.

Artificial intelligence is a field with a long history starting as early as 1956, but today what we’re beginning to see emerge is a new convergence of 6 major technologies: AI, cloud, mobile, social, big data and blockchain. Each of the panelists agreed that as we enter into the next digital frontier, AI will be woven into each of these areas causing a “super-convergence” of capabilities.

Saxena predicts that “this age of the Internet is going to look small by comparison to what’s happening in AI.” It’s true. The proliferation of AI creates a new world of application and computation design, including embodied cognition in concierge-style robots that help when we need assistance.

Cloud will become “cognitive cloud,” a ubiquitous virtual data repository powered by a “digital brain” that understands human needs to help us engage with information seamlessly in work and life. Big data will evolve from being about understanding trends to understanding and predicting outcomes. In combination these developments will disrupt enterprise IT and other business models across the world.

But as we move from a “mobile first to an AI first” landscape, how do we differentiate the winners from the losers? And how can investors know where to place their bets?

Trust and transparency are going to be the two most critical pieces of winning applications. Imagine a hedge fund manager using AI algorithms to develop a financial strategy for their portfolio. Before placing millions of dollars at risk, that manager will need an explanation of why the AI chose a particular solution.

We’re seeing companies like Waze do this already. Beyond being a great way to navigate, Waze is a contextually aware, predictive computing platform that anticipates what information you need next based on your location and route. More applications in different industries — from healthcare, to retail, to personal finance — will soon act like Waze, using cognitive computing and context to constantly learn and anticipate what we need.
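The underlying pattern is simple enough to sketch. The toy example below (a generic illustration, not Waze’s actual algorithm) predicts a likely destination from past trips that share the current context of weekday and hour, which is the essence of anticipating what information a user needs next.

  # Generic sketch of context-aware prediction: guess the most likely destination
  # from past trips that share the current context (weekday + hour).
  from collections import Counter, defaultdict

  history = [  # (weekday, hour, destination) from past trips
      ("Mon", 8, "office"), ("Tue", 8, "office"), ("Wed", 8, "office"),
      ("Sat", 10, "gym"),   ("Sat", 10, "gym"),   ("Sun", 18, "parents"),
  ]

  by_context = defaultdict(Counter)
  for weekday, hour, destination in history:
      by_context[(weekday, hour)][destination] += 1

  def predict(weekday, hour):
      counts = by_context.get((weekday, hour))
      return counts.most_common(1)[0][0] if counts else None

  print(predict("Mon", 8))   # "office" -> the app can pre-load traffic for that route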

The businesses that will win are the ones that apply AI capabilities not just to automate their processes, but that use AI to run their business in a fundamentally different way.

First, we have to understand the areas that AI can best be applied. The challenge in cognitive computing is interpreting and understanding the oftentimes imprecise language we use as humans. As High pointed out in the panel, “our true meaning is often hidden in our context.” AI needs to be able to learn from these conditions to gain meaning.

It’s not a question of who has the best technology, but who has the best understanding and appreciation of what the technology can unlock. The people who will gain the most from AI are the ones who are rethinking their business processes, not just running their existing businesses better.

As more of our lives are aided by intelligent systems in our homes, at work, and in our cars, other questions arise. Will AI get so smart that it replaces us? Sutton, High and Saxena all agree “no,” but they say that some tasks will certainly become automated. They believe the more important change will be the creation of a new class of jobs. According to Forrester, 25% of all job tasks will be offloaded to software robots, physical robots, or customer self-service automation — in other words, all of us will be impacted in some way. But while that may sound discouraging, the same study states that 13.6 million jobs will be created using AI tools over the next decade.

The nature of work will change dramatically with AI. We’ll have technology that augments our skills and abilities — perhaps something like a “JARVIS suit” that allows us to be superhuman. We’ll work alongside robotic colleagues that help us with our most challenging tasks. In terms of cognitive computing, we’re talking about amplifying human cognition, not replacing the human mind. There is so much to be gained when we uncover ideas and solutions we wouldn’t have been able to do on our own.

Today 2.5 exabytes of data are being produced every day, and the world’s accumulated data is expected to reach 44 zettabytes by 2020. Like an actual brain — a super-complex network of biological components that learns and grows with experience — these interconnected data points, along with the machine learning algorithms that learn and act upon them on our behalf, are the building blocks of our AI-powered future.
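The scale of those figures is easier to grasp with a quick back-of-the-envelope calculation (note that the 44-zettabyte figure is a projection for the total global datasphere, not a daily rate):

  # Back-of-the-envelope check on the data-growth figures cited above.
  EB_PER_ZB = 1000                      # 1 zettabyte = 1,000 exabytes (decimal units)

  daily_eb = 2.5                        # exabytes produced per day today
  target_zb = 44                        # projected total global data by 2020

  yearly_zb = daily_eb * 365 / EB_PER_ZB
  print(f"At today's rate: {yearly_zb:.2f} ZB per year")        # ~0.91 ZB/year

  years_at_current_rate = target_zb / yearly_zb
  print(f"Years to reach {target_zb} ZB at that rate: {years_at_current_rate:.0f}")  # ~48

In other words, today’s daily output alone would take decades to accumulate 44 ZB, so hitting that total by 2020 implies both a large existing base of data and a rapidly accelerating creation rate.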

By Royal Frasier, Gryphon Agency for Gigaom Change 2016


Tuesday, 04 October 2016

Millennials and the Workplace

Recent research undertaken by Dell and Intel explored a wide range of issues regarding the changing landscape of the modern workplace. In this post, I explore a number of the study’s findings that focused closely on the wants, needs, and concerns of Millennials and the workplace.

One of the major findings of this research was that Millennials’ attitudes toward technology in the workplace are quite different from those of other cohorts. It turns out that Millennials are about 10% more tech-oriented and collaborative than the norm. But that’s just the starting point.

The Research

Penn Schoen Berland, on behalf of Dell and Intel, conducted 3,801 online interviews across nine international markets between April 5 and May 3, 2016. The respondents included 2,050 men and 1,751 women, segmented into 1,412 Millennials and 2,389 non-Millennials.

Millennials are like everyone else, only more so

[Figure: millenials-and-workplace-fig1]

Millennials believe the workplace is becoming more collaborative, and they are willing to adopt new technologies, such as virtual sharing and augmented and virtual reality, at a higher rate. They are also more inclined to believe that face-to-face interaction will become obsolete, a finding I disagree with in part because it is contradicted by other studies in which Millennials have expressed a higher-than-average preference for face-to-face meetings over other forms. Nonetheless, these findings collectively show a strong lean toward adopting future technologies. (And I wonder whether we will see an additional 10% difference with Gen Z workers, who are just starting to join the workforce, or whether they will head in a different direction.)

The rightmost comparison in the chart above shows that a sizeable minority of Millennials would consider quitting a job if the technology fell below their idea of acceptable, again at a rate 10% above the baseline. And when contrasted with the 35-and-older cadre, the difference is even more stark:

[Figure: millenials-and-workplace-fig-2]

Likewise, as you would expect, workplace technology has an even bigger influence on Millennials’ decisions about accepting a job, with 82% saying it would factor into their choice:

[Figure: millennials-and-workplace-fig-3]

There is little doubt that Millennials are eager to embrace new technologies.
The research discovered that four in five Millennials feel that having access to technology at work makes it easier for them to perform their duties:

[Figure: millennials-and-workplace-fig-4]

The research showed that about a quarter (28%) of Millennials believe they are much more collaborative than they used to be, while 79% believe workplaces are more collaborative than in the past:

[Figure: millennials-and-workplace-fig-5]

Millennials have strongly futures-leaning perceptions about technology’s role in the workplace, with a strong majority looking forward to higher levels of virtual sharing (73%), smart offices (70%), and virtual/augmented reality (67%):

[Figure: millennials-and-workplace-fig-6]

Conclusions and Takeaways

We’ve seen in this research strong evidence for the notion that Millennials’ attitudes toward technology are more positive across the board, and that they believe in the promise of future technology to increase collaboration, make it easier to do their jobs, and decrease the reliance on face-to-face interactions.

Therefore, companies — and their leaders — have an implicit need to support the technological hunger of Millennials. Several of the findings from this study carry more explicit implications for leadership. Specifically this: to attract and retain the best and brightest Millennials, companies will have to make better-than-average investments in new technologies… or else. Clearly, Millennials will reject companies that fall short in this area.

We shouldn’t forget the other workers, like the over 35 year-olds. While they are somewhat less over the top about new technologies, large numbers still want higher levels of productivity, collaboration, and smarter workplaces.

So management must find ways to push the level of technology so that all cadres of the workforce — Millennials and non-Millennials alike — benefit from workplaces that are smarter, more collaborative, and more productive for all involved.


The future of retail will need people, and this blog post shows why

Cotswold Outdoor is a hiking and outdoor retailer based in the UK. It also happens to be the place of work for a sales assistant who goes by the name of Big Dave, and who went beyond the call of duty for a blind customer and his helper. You can read about it yourself – it’s on the store’s Facebook page.

Now, sure, we all love a good story with a happy ending. But this goes deeper, particularly when we consider the tribulations retail has been facing over recent years. Going beyond the call of duty might be precisely what enables some stores – certainly those who differentiate on service rather than margins – to survive. Deeper still, it goes to the heart of questions about the nature of work and whether many jobs, in particular customer service jobs, will be automated out of existence.

On this latter point, the prevailing mood is currently pessimistic – a fact which led me to jot down ten reasons why nobody would be out of a job. That post spawned some great comments, notably from Kirby, who argued the opposite. I believe, however, that Kirby missed the underlying point of the ten reasons. Simply put, it’s that we help each other because we are programmed to do so as a race, and we are also programmed to expect something in return. Money simplifies this but doesn’t change it.

Since I wrote the post I have been speaking to industry expert Vinnie Mirchandani, who has been spending considerable time cataloguing jobs and looking at the impact automation is having on them, and who has been kind enough to send me a review draft of the resulting book. I have yet to read it all, but if I could capture the conclusion in two words it would be “expect augmentation” – as Vinnie says, “The end result is an optimistic read on the changing nature of work, a celebration of outstanding workers, and the machines which are making them even better.”

We are descended from a heritage of outstanding workers, an ancient truth which has taken a bit of a hit since the industrial revolution kicked off the automation game. Work gives us meaning and makes us feel valued, and we probably couldn’t stop working even if we wanted to – ask any retiree who ended up volunteering, writing their memoirs or otherwise pursuing a worthy endeavour.

Will work change? Of course it will, and it already, profoundly, has. Re-skilling will become the norm rather than the exception. But to suggest we face a future where work is no longer a thing is to fail to understand what makes us human. And just as we will always have work, so will we always have outstanding workers such as Big Dave to celebrate.


Microsoft gives up on Band?

Various sources — including Mary Jo Foley at ZDNet — report that Microsoft has pulled all references to the Band fitness devices from the Microsoft Store online. She reports that the company responded to questions about the product with this:

We have sold through our existing Band 2 inventory and have no plans to release another Band device this year. We remain committed to supporting our Microsoft Band 2 customers through Microsoft Stores and our customer support channels and will continue to invest in the Microsoft Health platform, which is open to all hardware and apps partners across Windows, iOS, and Android devices.

I spoke with Christina Chen, then Microsoft’s General Manager, Emerging Devices Experiences, back in February. Reviewing my notes, we spoke almost exclusively about watches, and the Band never came up. She left Microsoft in April, and is now product director at YouTube gaming. Hmmm.

At any rate, it looks like Microsoft is regrouping on wearables, although maybe it’s just doubling down on sectors where it has a real play, like Hololens.


Monday, 03 October 2016

The Internet of Things is yet to arrive at the starting blocks of innovation

“We are but puny dwarfs perched on the shoulders of giants. We see more and farther… not because we have keener vision or greater height, but because we are lifted up and borne aloft on their gigantic structure.” Bernard of Chartres, died c.1124 (via John of Salisbury, 1159)

It may seem strange to trace the origins of the digital age back 900 years, but humanist and philosopher Bernard of Chartres nails it. Not only did he foresee the main tenet of the Platform Economy but he offered a useful framing of the potential of the Internet of Things, based on the relationship between eternal ideas and material objects.

Stretching a point? D’ya think? It’s worth reviewing how far we have actually got with the IoT so far. While we’ve seen a considerable amount of standardisation effort at the platform level, we are still a long way from providing the ‘giants’ upon which the broader section of us lowly creatures can innovate.

That’s not to understate the considerable effort that has already been made. In the Platform as a Service layer, Amazon’s AWS IoT, IBM Watson, Microsoft’s Azure IoT Suite and Zebra, along with a host of smaller players such as ThingWorx and Evrythng, offer massively scalable and open integration, streaming, storage and analytics capabilities.

In industry, the likes of Fujitsu (with GlobeRanger)  and Bosch have things going on; meanwhile Intel has an IoT Platform reference architecture to which a number of vendors have subscribed, including GE with its Predix industrial IoT framework and services. How easy and unfair it is, one might say, to suggest that such efforts are not already substantial.

But while such platforms and standardisation efforts are taking us way beyond where we have been, they are yet to arrive at a point where the real innovation explosion will take place. Solutions are currently domain-specific, frequently proprietary and a long way from the adoption levels seen by, say, social media.

Perhaps the closest are Xively or even IFTTT, but neither has the immediacy of social networking. In wearables, for example, Garmin, Strava et al continue to fight their corners. Apple just added a ‘Home’ icon to its mobile device screens, but it may have left many scratching their heads as to what it is for (as it did my wife). The challenges currently faced by Nest reflect the ‘solution without a problem’ stage we are in.

This isn’t a complaint. If I had a concern at all, it’s whether we are prepared for the wave of joined-up connectedness that will inevitably hit. Today’s ‘advances’ will be seen as a world-spanning gestation, a global laying of smart infrastructure upon which the next two decades of innovation will be built.

No user-facing ‘smart’ portal has been adopted to any extent — while some (such as Fluke) have mentioned a ‘Facebook of Things’, we are yet to see a billion-user go-to page to access and control our smart devices, either in work or at home. But we will, as sure as birth follows pregnancy.

Of course, this suggests that such an opportunity is sitting on the table. It is quite astonishing that the Googles, Facebooks, Microsofts and, indeed, Alibabas of the world aren’t ripping their gloves off and fighting tooth and nail to gain this position. Once they do (and are joined in the process by whichever next upstart becomes a household name), we will enter a new phase of innovation, thrilling and downright scary in equal measure. Lives will be saved, even as the rights of individuals are significantly undermined.

For we are but puny, in the face of such developments. But we will see more, and farther than the giants themselves. The relationship between eternal ideas and material objects is about to get a significant run for its money.


Tuesday, 27 September 2016

Review: DB Networks Enhances Database Security with Machine Learning

Protecting databases takes more than just securing the perimeter; it also takes a deep understanding of how users and applications interact with databases, as well as knowing which databases are alive and breathing on the network. DB Networks aims to provide the intelligence, analytics and tools to bring insight into the database equation.

It’s no secret that database intrusions are on the rise, much to the chagrin of those responsible for infosec. While many have focused on protecting the edge of the network and wrapping additional security around user access, the simple fact of the matter is that databases are the primary storehouses of private and sensitive information, and they are often the true targets of intruders.

Recent events, such as the Target breach, the theft of security clearance information from the US Office of Personnel Management (OPM) and the theft of medical records from the insurer Anthem, illustrate that protecting sensitive data is quickly becoming a losing battle. DB Networks is taking steps to turn the tide and bring victory to those charged with protecting databases.

The San Diego-based company offers its DBN-6300 appliance and its virtual cousin, the DBN-6300v, as founts of database activity, analytics and discovery that give today’s security professionals an edge against the ever-growing wave of cyberattacks targeting databases. Those products promise to equip security professionals and database administrators with tools that can identify and mitigate breaches before irreparable damage is done.

A case in point is the ubiquitous SQL injection attack, which is far more common than most will admit. SQL injection attacks have been around for more than ten years, and security professionals are more than capable of protecting against them. However, according to Neira Jones, the former head of payment security for Barclaycard, some 97 percent of data breaches worldwide are still due to an SQL injection somewhere along the line.
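To see why the problem refuses to die, consider the classic pattern below: a hypothetical login check built by string concatenation versus the same query issued with bound parameters. The concatenated form lets an attacker smuggle SQL syntax in through the input; the parameterized form does not. (The schema and values are invented for illustration; sqlite3 is used purely for brevity.)

  # Classic SQL injection illustration.
  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
  conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

  attacker_input = "anything' OR '1'='1"

  # Vulnerable: attacker input is concatenated straight into the SQL text.
  unsafe = f"SELECT * FROM users WHERE name = 'alice' AND password = '{attacker_input}'"
  print(conn.execute(unsafe).fetchall())      # returns the row despite a wrong password

  # Safe: bound parameters keep the input as data, never as SQL syntax.
  safe = "SELECT * FROM users WHERE name = ? AND password = ?"
  print(conn.execute(safe, ("alice", attacker_input)).fetchall())  # returns []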

Taking a Closer Look at the DB Networks IDS-6300:

I recently had a chance to put the DB Networks IDS-6300 through its paces at the company’s San Diego offices. The IDS-6300 is a physical appliance, built on Intel hardware as a 2U rack-mountable server. The device features four 10/100/1000 Ethernet ports for data capture, one 10/100/1000 Ethernet admin port and one 10/100/1000 Ethernet customer service port, as well as a 480GB SSD and 2TB of archival storage.

The device can be deployed by plugging it into either a SPAN port or a tap located at the core switch in front of the database servers. The idea is to place the device logically ahead of the database servers, yet behind the application servers, so it can focus on SQL traffic. The IDS-6300 is managed via a browser-based interface and supports the Chrome, Firefox and Safari browsers, with full IE support to follow in the near future.
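As a rough illustration of what “focusing on SQL traffic” from a SPAN or tap port means, the toy sketch below passively watches packets on the default Microsoft SQL Server port. It is only a stand-in for the appliance’s capture pipeline, and assumes the scapy library, root privileges and a mirrored interface.

  # Toy passive capture of SQL Server traffic on a mirrored (SPAN/tap) interface.
  from scapy.all import sniff, IP, TCP

  def report(pkt):
      if pkt.haslayer(IP) and pkt.haslayer(TCP):
          print(f"{pkt[IP].src}:{pkt[TCP].sport} -> {pkt[IP].dst}:{pkt[TCP].dport}, "
                f"{len(pkt)} bytes of SQL Server traffic")

  # MS SQL Server listens on TCP 1433 by default; store=False avoids buffering packets.
  sniff(filter="tcp port 1433", prn=report, store=False)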

I tested the device in a mock operational environment that included MS SQL databases and a demo version of a banking application that incorporated some known vulnerabilities. Setting up the device entailed little more than defining the capture ports and some very basic post-installation items. Once the device was configured to capture data, the next step was to identify databases.

Here, the IDS-6300 does an admirable job: it automatically discovers any database that generates any traffic at all, even a communication as simple as a basic SQL statement. The device monitors traffic 24/7 and continually checks for database activity.

That proves to be a critical element in the quest to secure databases. According to company representatives, many customers have discovered databases that IT was unaware were operating in production environments. What’s more, the database discovery capability can be used to identify rogue databases or databases that were never shut down after a project completed.

The database discovery information offers administrators real insight into what exactly is operating on the network and what is vulnerable to attack. Knowing that can be the first step in mitigating security problems, before even venturing into traffic analysis and detection.

Nevertheless, the product’s real power comes into play when detecting SQL injection attacks. Instead of using canned templates or signatures, the IDS-6300 takes SQL attack detection to the next level: the device learns what normal traffic looks like, records and analyzes what that traffic accomplishes, and then builds a behavioral model.

Simply put, the device learns how an application communicates with a database and uses that information to create a behavioral model. Once learning is complete, the device uses multiple detection techniques to validate future SQL statements against expected behavior. In practice, behavioral analysis proves immune to zero-day attacks, newly scripted attacks and even old, recycled attacks, because all of those attacks fall outside the norms of expected behavior.
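DB Networks does not publish its model internals, but the general idea of behavioral SQL analysis can be sketched as follows: during a learning phase, reduce each observed statement to a structural template with its literals stripped out, then flag any later statement whose template was never seen from that application. The code below is a deliberately simplified, hypothetical illustration of that idea, not the IDS-6300’s actual algorithm.

  # Simplified sketch of behavioral SQL analysis: learn the structural "shapes" an
  # application normally emits, then flag statements with unfamiliar shapes.
  import re

  def template(sql):
      """Strip literals so structurally identical statements map to one template."""
      t = sql.lower()
      t = re.sub(r"'[^']*'", "?", t)        # quoted string literals -> placeholder
      t = re.sub(r"\b\d+\b", "?", t)        # numeric literals -> placeholder
      return re.sub(r"\s+", " ", t).strip()

  class BehavioralModel:
      def __init__(self):
          self.known = set()

      def learn(self, sql):
          self.known.add(template(sql))

      def is_anomalous(self, sql):
          return template(sql) not in self.known

  model = BehavioralModel()
  model.learn("SELECT * FROM accounts WHERE id = 42")
  model.learn("SELECT * FROM accounts WHERE id = 7")

  print(model.is_anomalous("SELECT * FROM accounts WHERE id = 99"))             # False
  print(model.is_anomalous("SELECT * FROM accounts WHERE id = 99 OR 1=1 --"))   # True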

That behavioral analysis eliminates the need for signatures, blacklists, whitelists and other technologies that rely on pattern matching or static detection, which in turn reduces operational overhead and maintenance chores, turning SQL injection monitoring into something close to a plug-and-play paradigm.

When SQL Injection attacks occur, the IDS-6300 captures all of the traffic and transaction information around that attack. What’s more, the device categorizes, analyzes and presents the critical information about the attack so that administrators (or application engineers) can modify database code or incorporate firewall rules very quickly to remediate the problem.

That brings up another interesting point: the IDS-6300 proves to be a good candidate for helping organizations improve application code. With many businesses turning to outsourcing and/or modifying off-the-shelf and open-source software for application development, situations may arise where due diligence is not fully implemented and agile development projects end up introducing security flaws into application code. That is not an uncommon problem, at least according to Forrester Research’s Manatosh Das: poor application coding persists despite lessons learned. Das claims that more than two-thirds of applications have cross-site scripting vulnerabilities, nearly half fail to validate input strings thoroughly, and nearly one-third can fall foul of SQL injection. Das adds that security professionals and software engineers have known about these types of flaws for years, but they continue to show up repeatedly in new software code.

The IDS-6300 will quickly detect those newly introduced flaws, prevent poor programming practices from creating vulnerabilities, and then provide the information needed to fix those flaws.

The IDS-6300 offers another advantage: it can help customers consolidate databases by identifying which databases are active and what they are used for. That in turn can lead to companies combining databases and significantly reducing licensing and support costs. DB Networks reports that one of its customers was able to reduce database licensing costs by over $1,000,000 by detecting and consolidating databases that were discovered by the IDS-6300.

The IDS-6300 starts at $25,000 and is available directly from DB Networks and authorized partners. For more information, visit DBNetworks.com.

 

 


Thursday, 22 September 2016

Performance Management Brings New Found Value to IT

IT departments are always struggling to garner the praise they deserve. Yet, most organizations look upon IT as a necessary evil, one that is both expensive and somewhat obstructionist. However, nothing could be further from the truth, and IT departments the world over have pursued ideologies that highlight the value of the services they offer, while also demonstrating the importance that a properly executed IT management plan brings to the bottom line.

At last week’s Riverbed Disrupt event, GigaOM had a chance to talk with CIOs and network managers who have demonstrated the value of IT with application performance management platforms and services.

John Green, Chief Information Officer at Baker Donelson, the 64th-largest law firm in the country, offered some real-world examples of how Application Performance Management (APM) and end-user monitoring bring demonstrable value to an organization’s IT department.

Green said, “My staff supports some 275 different applications and more than 40 video conferencing rooms, which are in near constant operation.” Simply put, Green has come to know how much reliable service and an acceptable end-user experience shape the view that the firm’s 1,500 employees have of the IT department.

Green said, “I was deploying the best technology money could buy, but my end-users still weren’t happy.” Green was looking at a situation where unhappy end users could create dire circumstances that would impact the firm’s bottom line. Green added, “I could go to management meetings and offer proof that the networks were up 99.9% of the time, and that the databases and the email servers were delivering five-nine statistics of operation. Yet, my end users were still complaining.”

That is when Green had an epiphany, one that amounted to realizing that network performance statistics and end-user expectations rarely go hand in hand. Green said, “We needed the ability to track the actual end-user experience, and then use that information to meet user expectations.”

Green found those much-desired capabilities in SteelCentral Aternity, a product that can monitor any application on any device to provide the actual user perspective, at least when it comes to responsiveness and performance. Green said, “I have been an Aternity user for about seven years, and it completely transformed the way we relate to our end users.”

Nonetheless, Green said, “Aternity is only one part of the puzzle; although it provides valuable information, I would like to see the whole performance and experience picture on one pane of glass.”

That was the need that brought Green to the Riverbed Disrupt event. Riverbed recently purchased Aternity and is integrating the technology into its SteelCentral product line, looking to give its customers that single-pane-of-glass view. Green was impressed with the direction Riverbed is taking with end-to-end monitoring and offered, “With the Riverbed and Aternity combination, there is now a mix of tools that, when combined into a single pane of glass, gives you total visibility across your network, from the servers to the circuits.”

While the Riverbed event was about new technologies, the real message was that by providing full monitoring capabilities to IT, staffers can better serve end-users and demonstrate the value of effective IT.

 

 

 


Tuesday, 20 September 2016

Riverbed Demonstrates the Importance of Full Stack Monitoring

Complete end-to-end monitoring has become increasingly important as enterprises strive to move from legacy data centers to the promise of software-defined environments. After all, network managers encumbered by missing pieces of the network connectivity puzzle are likely to fail in the transition to software-defined solutions. That observation was made abundantly clear at Riverbed’s Disrupt event, held in Manhattan last week. Overcoming the obstacles of connectivity has become Riverbed’s clarion call, and the company is now offering comprehensive solutions that not only ease the transition to software-defined solutions, but also bring much more control and information to the network management realm.

Case in point is the company’s move to products that embrace the ideologies of a Software Defined Wide Area Network (SD-WAN), such as the company’s SteelConnect 2.0, an application-defined SD-WAN solution. In an interview with GigaOM, Joshua Dobies, vice president of product marketing at Riverbed, said “the new capabilities offered allow branch offices to directly access the cloud, all without having to backhaul everything back to the data center.” Dobies added “SD-WAN paves the way for complete digital transformation, allowing enterprises to quickly access the benefits of the cloud, while not discarding their existing investments in Data Center Technologies.”

Of course, the wholesale movement to the cloud means that technologies must transition to platforms that enable transformation without incurring disruption, a situation that proves to be the sweet spot for end-to-end monitoring. With the addition of full network visibility, along with end-user experience monitoring, network managers now have the ability to identify connectivity and performance problems on the fly and can quickly address those problems with policies and tuning.

With the introduction of SteelConnect 2.0, the next version of its SD-WAN offering, the company is giving customers greater visibility throughout the network, thanks to integration with SteelCentral, Riverbed’s end-to-end performance management platform, with the SteelHead products, and with Riverbed’s Interceptor offering, which gives SteelConnect greater scale for larger enterprise deployments. Riverbed Chairman and CEO Jerry Kennelly said, “Today, we’re delivering a software-defined architecture for a software-defined world, and expanding that infrastructure deeper into the cloud and more broadly across all end users.”

In addition to the new SteelConnect 2.0 release, SteelCentral, the company’s end-to-end performance management platform, will now incorporate technology from Aternity, which Riverbed acquired in July. Aternity brings the ability to monitor application performance on physical and mobile end-user devices to the SteelCentral product line. The addition of the Aternity technology and the extension of visibility into end-user devices give Riverbed a full portfolio of management offerings, according to Nik Koutsoukos, vice president of product marketing at Riverbed. “This brings full end-to-end management capabilities to those who need it most,” Koutsoukos told GigaOM.

 


Monday, 19 September 2016

Survey Reveals InfoSec is Doing it all Wrong!

While “doing it all wrong” may be an exaggeration, no one can deny that breaches are on the rise and IT security solutions seem to be falling behind the attack curve. Yet those looking to place blame may need only look in the mirror. At least that’s what a survey from cyber security vendor BeyondTrust indicates.

BeyondTrust surveyed over 500 senior IT, IS, legal and compliance experts about their privileged access management practices. The survey revealed some interesting trends, some of which should fall under the banner of “they should know better.” For example, only 14 percent regularly cycle their passwords, meaning that 86 percent of those surveyed are skipping one of the top best practices for password and credential management. Adding insult to injury, only 3 percent of those surveyed monitor systems in real time and have the capability to terminate a live session that may be indicative of a breach.

Simply put, the survey indicates that the majority of organizations need to do much more to protect systems from breaches, many of which could easily be avoided if the proper policies were put into effect. That said, the survey also revealed that 52 percent of respondents are not doing enough about known risks. In other words, they understand what the risks are, but have not deployed the technologies or crafted the policies to mitigate those risks.
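As a concrete, deliberately simplified example of closing one of those known gaps, the sketch below flags privileged credentials that have not been rotated within a 90-day policy. The account names and dates are invented, and a real deployment would pull this data from a password vault or privileged access management tool rather than a hard-coded list.

  # Hypothetical audit sketch: flag privileged accounts overdue for password rotation.
  from datetime import date

  ROTATION_POLICY_DAYS = 90

  privileged_accounts = {            # account -> date the password was last changed
      "oracle_dba":   date(2016, 3, 1),
      "domain_admin": date(2016, 8, 20),
      "svc_backup":   date(2015, 11, 5),
  }

  today = date(2016, 9, 19)          # fixed "now" so the example is reproducible

  for account, last_rotated in privileged_accounts.items():
      age = (today - last_rotated).days
      if age > ROTATION_POLICY_DAYS:
          print(f"ROTATE {account}: password is {age} days old "
                f"(policy is {ROTATION_POLICY_DAYS} days)")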

Mitigating those risks should be one of the top jobs of InfoSec today, especially since most of the identified risks can be quickly resolved, using off the shelf products and by just applying best practices. BeyondTrust has developed some recommendations that InfoSec professionals can take to heart to lower risk and harden systems from breaches.

Those recommendations include:

  • Be granular: Implement granular least privilege policies to balance security with productivity. Elevate applications, not users.
  • Know the risk: Use vulnerability assessments to achieve a holistic view of privileged security. Never elevate an application’s privileges without knowing if there are known vulnerabilities.
  • Augment technology with process: Reinforce enterprise password hygiene with policy and an overall solution. As the first line of defense, establish a policy that requires regular password rotation and centralizes the credential management process.
  • Take immediate action: Improve real-time monitoring of privileged sessions. Real-time monitoring and termination capabilities are vital to mitigating a data breach as it happens, rather than simply investigating after the incident.
  • Close the gap: Integrate solutions across deployments to reduce cost and complexity, and improve results. Avoid point products that don’t scale. Look for broad solutions that span multiple environments and integrate with other security systems, leaving fewer gaps.

 

In an interview with GigaOM, Kevin Hickey, President and CEO at BeyondTrust, offered “Companies that employ best practices and use practical solutions to restrict access and monitor conditions are far better equipped to handle today’s threat landscape.”

Hickey added “The survey proved critical for helping BeyondTrust to better identify threats based upon privilege management, and also helped us evolve our product offerings to make privilege management a much easier process for security professionals.”

Hickey’s statements were validated by the launch of some new product offerings aimed at bringing ease of privilege management to those charged with IT security. The two new offerings are the BeyondTrust Managed Service Provider (MSP) Program and an Amazon Machine Image (AMI) of BeyondInsight available on the Amazon Marketplace. Those products are geared to prevent breaches that involve privileged credentials across deployments that include on-premises solutions, virtual appliances, the cloud, and managed service providers.


Friday, 16 September 2016

Hyper Convergence Poses Unique Challenges for SAN Technologies

With the move towards hyper-convergence in full swing, many organizations are faced with the challenge of moving their massive data stores into virtualized environments. That challenge came to the forefront of discussion at VMworld 2016, where all things related to hyper-convergence were discussed ad nauseam.

Even so, many were still left wondering whether it is even possible to have traditional storage technologies, such as SAN and NAS, effectively coexist in an environment that is transitioning into a hyper-converged entity. What’s more, the uncertainties of transition, driven by potential communications problems, performance issues and incompatibilities, could force wholesale, expensive upgrades to support the move to hyper-convergence, an issue many network managers and CIOs would love to avoid.

Simply put, the move towards hyper-convergence, which promises improved efficiencies and reduced operating expenses, can be derailed by the high costs of transitioning to virtualized SANs, an irony worth noting. Nevertheless, those challenges have not stopped VMware Virtual SAN from becoming the fastest-growing hyper-converged solution, with over 3,000 customers to date. That said, there is still room for improvement, such as helping VMware Virtual SAN support even more workloads, and that is exactly where vendor Primary Data comes into play.

At VMworld 2016, Primary Data announced the availability of the company’s DataSphere platform, which brings a storage-agnostic platform to virtualized environments. In other words, Primary Data is able to tear down storage silos without actually disrupting the configuration of those silos. It accomplishes that by creating a virtualization layer that masks the individual storage silos and presents them as a unified, tiered storage lake, which is driven by policies and offers almost infinite configuration options.

Abstracting data from storage hardware is not a new idea. However, Primary Data goes far beyond what companies such as FalconStor and StoneFly bring to the world of hyper-convergence. For example, DataSphere offers a single-pane-of-glass management console, which unifies management across the various storage tiers, regardless of the storage type. What’s more, the platform goes beyond the concept of an SLA (Service Level Agreement) and introduces a new concept, aptly abbreviated as SLO (Service Level Objective). Primary Data’s Kaycee Lai, an executive with the company, explained to GigaOM that “SLOs are business objectives for applications. They define a commitment to maintain a particular state of the service in a given period. For example, specific write IOPS, read IOPS, latency, and so forth, to maintain for each application. SLOs are measurable characteristics of the SLA.”

Lai added “DataSphere will support DAS, NAS, and Object as storage types. Block level support for SAN will follow in the next release.” One of the key elements offered by the platform is the ability to work with storage tiers, without the disruption of having to rebuild storage silos. Lai added “Tiers are a logical concept in DataSphere. Tiers are simply a class of storage that is mapped to a particular SLO. The notion of having multiple tiers is not as important as having multiple objectives requiring the specific storage to meet those objectives. Customers can create as many objectives as their business requires.”
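Lai’s description of SLOs as “measurable characteristics of the SLA” can be made concrete with a small sketch. The field names below are invented for illustration and are not DataSphere’s actual policy syntax; the point is simply that an SLO is data that observed storage metrics can be checked against.

  # Hypothetical illustration of Service Level Objectives as data: per-application
  # targets for storage behavior, checked against observed metrics.
  slos = {
      "trading-app": {"read_iops_min": 50000, "write_iops_min": 20000, "latency_ms_max": 2},
      "archive-app": {"read_iops_min": 500,   "write_iops_min": 100,   "latency_ms_max": 50},
  }

  observed = {
      "trading-app": {"read_iops": 61000, "write_iops": 18000, "latency_ms": 1.6},
      "archive-app": {"read_iops": 900,   "write_iops": 300,   "latency_ms": 12.0},
  }

  def violations(app):
      slo, metrics = slos[app], observed[app]
      issues = []
      if metrics["read_iops"] < slo["read_iops_min"]:
          issues.append("read IOPS below objective")
      if metrics["write_iops"] < slo["write_iops_min"]:
          issues.append("write IOPS below objective")
      if metrics["latency_ms"] > slo["latency_ms_max"]:
          issues.append("latency above objective")
      return issues

  for app in slos:
      print(app, violations(app) or "meets its SLO")
  # trading-app is flagged for write IOPS; its data could then move to a faster tier.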

In the quest to make hyper-convergence commonplace, Primary Data smooths the bumpy storage path with several capabilities, which the company identifies as the ability to:

  • Adapt to continually changing business objectives with intelligent data mobility.
  • Scale performance and capacity linearly and limitlessly with unique out-of-band architecture.
  • Reduce costs through increased resource utilization and simplified operations.
  • Simplify management through global and automated policies.
  • Accelerate upgrades of new solutions such as VMware vSphere 6 with seamless migration using existing infrastructure.
  • Reduce application downtime with automated non-disruptive movement of data.
  • Deliver a full range of data services across all applications in the data center.

 

 


Tuesday, 13 September 2016


Research Proves that a Customer Centric Approach Can Bring Unforeseen Value

Service management vendor ServiceNow recently commissioned Intergram Research to conduct a survey that dispels some of the common myths around service enablement, a realization ServiceNow has long predicted. In an interview with GigaOM, Holly Simmons, Sr. Director, Global Product Marketing, Customer Service Management, said, “the survey found that the companies that excel at customer service are 127% more likely to enable their customer service agents to enlist the help of different parts of the organization in real-time.”

Or, more simply put, by transforming customer service into a team sport, organizations can better meet the needs of their customers in a much shorter time frame. However, that transformation requires more than just good intentions; it requires a platform that can tear down the silos that surround people and systems, ultimately delivering the ability to share resolutions and improve customer service across the whole services spectrum.

That ideology is backed by the findings of Intergram Research, which surveyed senior managers in customer service roles at 200 U.S. enterprises with at least 500 employees.

The Survey Results:

The survey revealed three characteristics that separate the companies with the very best customer service from those that struggle. Companies identified as top-tier are:

  • More collaborative. They are more likely to have enabled their customer service agents to engage the help of different parts of the organization when addressing a customer’s problem.
  • Better problem-solvers. Customer service leaders are also more likely to be able to resolve the root cause of a customer’s problem (a crucial component of closing the resolution gap).
  • Self-service providers. And finally, these top-tier organizations are more likely to offer self-service options for common requests, freeing them up to focus on more strategic issues.

While for some the above may amount to little more than common sense, the fact of the matter is that many organizations have created silos around their various customer service elements, which hampers collaboration and adds to the time it takes to solve a customer’s problems. What’s more, those silos add hidden expenses to already overtaxed support resources, meaning that the collective knowledge of customer support must be relearned during almost any new interaction.

It is those inefficiencies that lead to customers fleeing from specific vendors, especially in the realm of IT. If a customer or client cannot get a quick resolution to a problem, they may take their business elsewhere.

Simmons adds “Resolving a customer’s issue quickly and effectively requires real-time collaboration, coordination, and accountability among customer service, engineering, operations, field services and other departments. But that’s just not happening at more than half of the companies surveyed. Customer service still sits on an island without a bridge to other departments, partners, and customers. That slows the resolution process, and frustrates both customers and the agents trying to help them.”

The survey also illustrated that the primary problems facing organizations seeking to improve customer service include the difficulty of connecting all service processes, compounded by siloed service departments and a lack of automation. Each of those three factors impacted more than 50% of those surveyed and, even when viewed as single issues, proved to be a primary barrier to successful customer service transformation.

Call to Action:

While the survey highlights both the problems and solutions surrounding agile customer service, transformation can only take place if certain ideologies are upheld. According to ServiceNow, organizations that treat customer service as a “team sport” and engage the right people from relevant departments to solve problems are in a better position to proactively address the underlying reasons for customer calls. They also empower their customers to quickly answer their own questions–through self-service portals, knowledge bases, and communities–further reducing the need to interact with customer service agents. The more sophisticated customer service organizations aspire to the ideal of “no-service” by combining these practices to help eliminate the reasons for customer calls in the first place.

 


Announcing the Full Keynote Panelist Lineup at Gigaom Change fifianahutapea.blogspot.com

Gigaom Change 2016 Leader’s Summit is just one week away, September 21-23 in Austin. The event will take place over two and a half days of keynote panels, with a lineup of speakers who are visionaries making strategic R&D and proof-of-concept investments to bring concepts to reality, forging multi-billion dollar companies along the way.

Three top industry experts in each of the following areas will highlight the current impact these innovations are having, then pivot toward what will be possible in the future: Robotics, AI, AR/VR/MR, Human-Machine Interface, Cybersecurity, Nanotechnology and 3D+ Printing.

Keynote panelists include leading theorists and visionaries like Robert Metcalfe, Professor of Innovation, Murchison Fellow of Free Enterprise at the University of Texas, and Rob High, IBM Fellow, Vice President and CTO, IBM Watson. The lineup also includes practitioners who are actively implementing these technologies within companies, like Shane Wall, CTO and Global Head of HP Labs; Melonee Wise, CEO of Fetch Robotics; Stan Deans, President of UPS Global Logistics and Distribution; and Rohit Prasad, Vice President and Head Scientist, Amazon Alexa. We will hear from Sapient about AI, IBM about nanotech, SoftBank about robots, and a wide range of other innovators creating solutions for visionary enterprises.

We couldn’t be more excited to introduce you to the full lineup of this extraordinary group.

Our opening night keynote speaker will be internet/ethernet pioneer Robert Metcalfe, Professor of Innovation, Murchison Fellow of Free Enterprise at The University of Texas.
Speaking on the VR/AR/MR panel is Jacquelyn Ford Morie Ph.D., Founder and CEO of All These Worlds LLC and Founder & CTO of The Augmented Traveler Corp. Dr. Morie is widely known for using technology such as Virtual Reality to deliver meaningful experiences that enrich people’s lives.
Discussing the subject of robotics is Rodolphe Gelin, EVP Chief Scientific Officer, SoftBank Robotics. Gelin has worked for decades in the field of robotics, focusing primarily on developing mobile robots for service applications to aid the disabled and elderly. He heads the Romeo2 project to create a humanoid personal assistant and companion robot.
On the artificial intelligence panel, Manoj Saxena, Executive Chairman of CognitiveScale and a founding managing director of The Entrepreneurs’ Fund IV, a $100m seed fund, will address the cognitive computing space.
Speaking on the subject of nanotechnology is Dr. Heike Riel, IBM Fellow & Director Physical Sciences Department, IBM Research. Dr. Riel’s work focuses on advancing the frontiers of information technology through the physical sciences.
Addressing human-machine interface is Mark Rolston, Cofounder & Chief Creative Officer, argodesign. Rolston is a renowned designer who focuses on groundbreaking user experiences and addresses the modern challenge of design beyond the visible artifact – in the realm of behavior, the interaction between human and machine, and other unseen elements.
Discussing the subject of artificial intelligence is Rob High, IBM Fellow, Vice President and Chief Technology Officer of IBM Watson. High has overall responsibility to drive Watson technical strategy and thought leadership.
Addressing nanotechnology is Dr. Michael Edelman, Chief Executive Officer of Nanoco. Through his work with Nanoco, Dr. Edelman and his team have developed an innovative technology platform using quantum dots that is set to transform lighting, bio-imaging, and much more.
As CEO of Fetch Robotics, which delivers advanced robots for the logistics industry, Melonee Wise will speak to the state of robotics today and the need and potential for the entire industry to transform to meet demand for faster, more personalized logistics/ops delivery using “collaborative robotics”.
As Chief Technology Officer and Global Head of HP Labs, Shane Wall drives the company’s technology vision and strategy, new business incubation and the overall technical and innovation community. Joining our 3D+ Printing panel, Wall will provide real insights into how 3D+ printing is going to transform and disrupt manufacturing, supply chains, even whole economies.
Taking a place on the Human-Machine Interface panel is David Rose, an award-winning entrepreneur, author, and instructor at the MIT Media Lab. His research focuses on making the physical environment an interface to digital information.
Joining the 3D+ Printing panel is Stan Deans, President of UPS Global Logistics and Distribution. Deans has been instrumental in building UPS’s relationship with Fast Radius by implementing its On Demand Production Platform™ and 3D Printing factory in UPS’s Louisville-based logistics campus. By building this disruptive technology into its supply chain models, UPS is now able to bring new value to manufacturing customers of all sizes.
Addressing human-machine interface is Rohit Prasad, Vice President and Head Scientist, Amazon Alexa, where he leads research and development in speech recognition, natural language understanding, and machine learning technologies to enhance customer interactions with Amazon’s products and services.
Joining our AR/VR/MR panel, Liam Quinn is VP, Senior Fellow & CTO for Dell, responsible for leading the development of the overall technology strategy. A key passion is xReality, where Quinn drives the development and integration of specific applications across AR & VR experiences, as well as remote maintenance, gaming and 3D applications.
Niloofar Razi is SVP & Worldwide Chief Strategy Officer for RSA. As part of the Cybersecurity panel she brings more than 25 years of experience in the technology and national security sectors, leading corporate development and the implementation of investment strategies for billion-dollar industries.
Michael Petch is a renowned author & analyst whose expertise in 3D+ printing will bring deep insights into advanced, additive manufacturing technologies on our Nanotechnology panel. He is a frequent keynote speaker on the economic and social implications of frontier technologies.
Josh Sutton is Global Head, Data & Artificial Intelligence for Publicis.Sapient. As part of the AI panel, Josh will discuss how to leverage established and emerging artificial intelligence platforms to generate business insights, drive customer engagement, and accelerate business processes via advanced technologies.
Joining our AR/VR/MR panel is Melissa Morman, Client Experience Officer, BuilderHomesite Inc. Morman is a member of the original founding executive team of BHI/BDX (Builders Digital Experience) and advises top executives in the homebuilding, real estate, and building products industries on the digital transformation of their business.
Joining our Cybersecurity panel is John McClurg, VP & Ambassador-At-Large, Cylance. McClurg was recently voted one of America’s 25 most influential security professionals, sits on the FBI’s Domestic Security (DSAC) and National Security Business Alliance (NSBAC) Councils, and served as the founding Chairman of the International Security Foundation.
Speaking on our Cybersecurity panel is Mark Hatfield, Founder and General Partner of Ten Eleven Ventures, the industry’s first venture capital fund focused solely on investing in digital security.
Speaking on our robotics panel is Mark Halverson, CEO of Precision Autonomy, whose mission is to make unmanned and autonomous vehicles a safe reality. Precision Autonomy operates at the intersection of artificial intelligence and robotics, employing crowdsourcing and three-dimensional augmented reality to allow UAVs and other unmanned vehicles to operate more autonomously.
Special guest James V Hart is an award-winning and world-renowned Hollywood screenwriter whose film credits include Contact, Hook, Bram Stoker’s Dracula, Lara Croft: Tomb Raider, August Rush, Epic and many more projects in various stages of development, including Kurt Vonnegut’s AI-fueled story Player Piano. With us he’ll discuss the impact of storytelling on how we’ve formed our views of the future.

Gigaom Change 2016 Leader’s Summit is just one week away, September 21-23 in Austin, but there are still a few tickets available for purchase. Reserve your seat today.


Monday, 12 September 2016

Fluke briefing report: Closing the gap between things and reality fifianahutapea.blogspot.com

The Internet of Things is great, right? I refer the reader to the vast amount of positive literature washing through the blogosphere, no doubt being added to even as I write this. At the same time, plenty of people are pointing out the downsides — data security for example, more general surveillance issues, or indeed the potential for any ‘smart’ object to be hacked.

All well and good, in other words it’s a typical day in techno-paradise. But the conversation itself is skewed towards the ability to smarten up — that is, deliver new generations of devices that have wireless sensors built in. What of the other objects that make up 98% (I estimate) of the world that we live in?

Enter companies such as Fluke, which earned its stripes over many years of delivering measurement kit to engineers and technicians, from multimeters to higher-end stuff such as thermal imaging and vibration testing. While such companies might not have a high profile outside of operational circles, they are recognising the rising tide of connectedness and doing something about it in their own domains.

In Fluke’s case, this means manufacturing plants, construction sites and other places where the term ‘rugged’ is a need to have, not a nice to have. Such sites have plenty of equipment that can’t simply be replaced with a smarter version, but which nonetheless can benefit substantially from remote measurement and management.

The current consequence, Fluke told me in a recent briefing about their let’s connect-the-world platform (snappily titled the “3500 FC Series Condition Monitoring System”), is that failures are captured after the event. “We have more than 100,000 pieces of equipment and the reliability team can only assess so many. We’ve never been able to have maintenance techs collect data for us, until now,” reports a maintenance supervisor at one US car manufacturer.
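As a rough illustration of the shift Fluke describes, from capturing failures after the event to continuously watching readings from connected sensors, here is a toy sketch; the threshold, asset names and readings are invented, and this is not the 3500 FC platform itself.

    # Toy condition-monitoring sketch: flag an asset as soon as its average
    # vibration reading drifts above an alarm threshold, instead of finding
    # out after it has failed. All values here are hypothetical.
    from statistics import mean

    VIBRATION_LIMIT_MM_S = 7.1  # assumed alarm threshold (mm/s RMS)

    def check_asset(asset_id, readings):
        avg = mean(readings)
        if avg > VIBRATION_LIMIT_MM_S:
            print(f"ALERT {asset_id}: vibration {avg:.1f} mm/s exceeds {VIBRATION_LIMIT_MM_S} mm/s")
        else:
            print(f"{asset_id}: ok ({avg:.1f} mm/s)")

    check_asset("pump-07", [3.2, 3.4, 3.1])    # -> ok
    check_asset("motor-12", [8.9, 9.4, 9.1])   # -> ALERT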

That Fluke are upbeat about the market opportunity nearly goes without saying — after all, there really is a vast pool of equipment that can seriously benefit from being joined up — but the point is, the model goes as wide as there are physical objects to manage. And equally there’s a ton of companies like Fluke that are smartening up their own domains, making a splash in their own jurisdictions. Zebra’s smart wine rack may just have been a proof of concept, but give it five years and all wine lovers will have one.

Inevitably, there will be a moment of shared epiphany when all such platforms start integrating together, coupled with some kind of Highlander-like fight as IoT integration and management platforms look to knock the rest out of the market. I’m reminded of the moment, back in the early 90’s, when telecoms manufacturers adopted the HP OpenView platform en masse, leading to possibly the dullest Interop Expo on record.

Yes, the future will be boring, as we default to using stuff that we can remotely monitor and control. As consumers we may still like using ‘dumb stuff’ but for businesses that interface with the physical world, to do so would make no commercial sense. Equally however, such a dull truth will provide a platform for new kinds of innovation.

I could postulate what these might be but the Law of Unexpected Consequences has the advantage. All I do know is, it won’t be long at all before what is seen as exceptional — the ability to monitor just about everything — will be accepted as the norm. At that point, and to make better use of one of Apple’s catchphrases, everything really will be different.


Wednesday, 07 September 2016

Welcome to the Post-Email Enterprise: what Skype Teams means in a Slack-centered World fifianahutapea.blogspot.com

Work technology vendors very commonly — for decades — have suggested that their shiny brand-new tools will deliver us from the tyranny of email. Today, we hear it from all sorts of tool vendors:

  • work management tools, like Asana, Wrike, and Trello, built on the bones of task managers with a layer of social communications grafted on top
  • work media tools, like Yammer, Jive, and the as-yet-unreleased Facebook for Work, built on a social networking model to move communications out of email, they say
  • and most prominently, the newest wave of upstarts, the work chat cadre, has arrived, led by Atlassian’s HipChat but most prominently by the mega-unicorn Slack, a company with such a strong gravitational field that it seems to have sucked the entire work technology ecosystem into the black hole of its disarmingly simple model of chat rooms and flexible integration.

Has the millennium finally come? Will this newest paradigm for workgroup communications unseat email, the apparently undisruptable but deeply unlovable technology at the foundation of much enterprise and consumer communication?

Well, a new announcement hit my radar screen today, and I think that we may be at a turning point. In the words of Winston Churchill, in November 1942 after the Second Battle of El Alamein, when it seemed clear that the WWII allies would push Germany from North Africa,

Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning.

And what is this news that suggests to me we may be on the downslope in the century-long reign of email?

Microsoft is apparently working on a response to Slack, six months after the widely reported termination of acquisition discussions. There has been a great deal of speculation about Microsoft’s efforts in this area, especially considering the now-almost-forgotten acquisition of Yammer (see Why Yammer Deal Makes Sense, and it did make sense in 2012). However, after that acquisition, Microsoft — and especially Bill Gates, apparently — believed they would be better off building Slackish capabilities into an existing Microsoft brand. But since Yammer is now an unloved product inside the company, the plan was to build these capabilities into something the company has doubled down on: Skype. So now we see Skype Teams, coming soon.

Microsoft may be criticized for attempting to squish too much into the Skype wrapper with Skype Teams, but we’ll have to see how it all works together. It is clear that integrated video conferencing is a key element of where work chat is headed, so Microsoft would have had to come up with that anyway. The rest of the details will have to wait for actual hands-on inspection (so far, I have had only a few confidential discussions with Microsofties).

My point is that we are moving into a new territory, a time where work chat tools will become the super dominant workgroup communications platform of the next few decades. This means that the barriers to widespread adoption will have to be resolved, most notably, work chat interoperability.

Most folks don’t know the history of email well enough to recall that at one time email products did not interconnect: my company email could not send an email to your company email. However, the rise of the internet and the creation of standard internet email protocols led to a rapid transition, so that we could stop using CompuServe and AOL to communicate outside the company.
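For readers who want the mechanics made concrete, the sketch below shows what that standards-based interoperability looks like in practice: any compliant mail server can hand a message to any other, because both ends speak SMTP. The server address, credentials and mailboxes are placeholders.

    # Minimal sketch of cross-provider delivery over standard SMTP.
    # Server, credentials and addresses are placeholders.
    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "me@my-company.example"
    msg["To"] = "you@your-company.example"
    msg["Subject"] = "Interoperability test"
    msg.set_content("Delivered across providers because both ends speak SMTP.")

    with smtplib.SMTP("smtp.my-company.example", 587) as server:
        server.starttls()                                       # upgrade the connection to TLS
        server.login("me@my-company.example", "app-password")   # placeholder credentials
        server.send_message(msg)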

It was that interoperability that led to email’s dominance in work communications, and similarly, it will take interoperability of work chat to displace it.

In this way, in the not-too-distant future, my company could be using Slack while yours might be using Skype Teams. I could invite you and your team to coordinate work in a chat channel I’ve set up, and you would be able to interact with me and mine.

If the world of work technology is to avoid a collapse into an all-encompassing monopoly with Slack at the center of it, we have to imagine interoperability will emerge relatively quickly. Today’s crude integrations — where Zapier or IFTTT copy new posts in HipChat to a corresponding channel in Slack — will quickly be replaced by protocols that all competitive solutions will offer.
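To illustrate just how crude those copy-based integrations are, here is a minimal one-way relay sketch: it simply re-posts messages from some source system into a Slack channel via an incoming webhook. The webhook URL is a placeholder and fetch_new_messages() is a hypothetical stand-in for polling the source system; true interoperability would replace this copying with a shared protocol.

    # Minimal one-way relay: copy messages from a source chat system into
    # Slack through an incoming webhook. fetch_new_messages() is hypothetical;
    # a real bridge would poll or subscribe to the source system's API.
    import requests

    SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

    def relay_to_slack(author, text):
        """Post one message into Slack via an incoming webhook."""
        resp = requests.post(SLACK_WEBHOOK_URL, json={"text": f"[{author}] {text}"}, timeout=10)
        resp.raise_for_status()

    def fetch_new_messages():
        """Hypothetical stand-in for polling the source chat system."""
        return [{"author": "alice", "text": "Build 42 is green."}]

    for message in fetch_new_messages():
        relay_to_slack(message["author"], message["text"])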

We’ll have to see the specifics of Skype Teams, and where Facebook at Work is headed. Likewise, all internet giants — including Apple, Google, and Amazon — seem to be quietly consolidating their market advantages in file sync-and-share, cloud computing, social networks, and mobile devices. Will we see a Twitter for Work, for example, after an Amazon acquisition? Surely Google Inbox and Google+ aren’t the last work technologies that Alphabet intends for us?

But no matter the specifics, we are certainly on the downslopes of the supremacy of email. We may have to wait an additional 50 years for its last gasping breath, but we’re now clearly in the chat (and work chat) era of human communications, and there’s no turning back.


Tuesday, 06 September 2016

Is There Life After Dell? SonicWALL Thinks So! fifianahutapea.blogspot.com

When SonicWALL was acquired by Dell back in 2012, many wondered how SonicWALL would fare under the auspices of the industry giant. That said, SonicWALL managed to maintain market share in its core SMB business sector and started making inroads into the large, distributed enterprise sector. Nonetheless, when Dell decided to sell off its software assets, along with SonicWALL, to private equity firms, many began to wonder once again what that meant for SonicWALL.

SonicWALL provided the answers to those queries at the company’s PEAK 2016 event, which was held last week in Las Vegas. The primary topics of discussion focused on applying SonicWALL technology and on what the future holds for SonicWALL, its partners and customers.

Along with the requisite product announcements, SonicWALL also hosted several educational sessions that brought cloud security to the forefront of partners’ minds, along with the challenges created by the ever-growing IoT infrastructure spreading through enterprises today.

SonicWALL offered a strong message that there is life after Dell, and that the company will thrive and grow despite the forced separation. For example, SonicWALL is in the process of strengthening the company’s channel programs to better support both its partners and end customers. What’s more, the company also announced its Cloud GMS offering, which is aimed at simplifying management, enhancing reporting, and reducing overhead. Cloud GMS also brings cloud-based management, patching and updating to the company’s army of partners, providing them with a critical weapon in the battle against hosted security vendors and those plying “firewalls in the cloud” as a means to an end.

The importance of the forthcoming Cloud Global Management System (GMS) cannot be overstated. SonicWALL aims to eliminate the financial, technical support and system maintenance hurdles normally associated with traditional firewalls, transforming what was once an isolated security solution into a cloud-managed security platform, a capability that will prove important to both customers and partners.

For partners, Cloud GMS brings a unique, comprehensive, low-cost monthly subscription to the table, which is priced based upon the number of firewalls under management. That model will allow partners to become something akin to a hosted security services provider, shifting customer expenses to OpEx instead of CapEx.

The SonicWALL Cloud GMS solution offers:

  • Governance: Establishes a cohesive approach to security management, reporting and analytics to simplify and unify network security defense programs through automated and correlated workflows to form a fully coordinated security governance, compliance and risk management strategy.
  • Compliance: Rapidly responds and fulfills specific compliance regulations for regulatory bodies and auditors with automatic PCI, HIPAA and SOX reports, customized by any combination of auditable data.
  • Risk Management: Provides the ability to move fast and drive collaboration and communication across a shared security framework, making quick security policy decisions based on time-critical and consolidated information for higher-level security efficacy.
  • Firewall management: MSPs will be able to leverage efficient, centralized management of firewall security policies similar to on-premises GMS features, including customer sub-account creation and increased control of user type and access privilege settings.
  • Firewall reporting: Real-time and historical, per firewall, and aggregated reporting of firewall security, data and user events will give MSPs greater visibility, control and governance while maintaining the privacy and confidentiality of customer data.
  • Licensing management: Seamless integration between GMS and MySonicWALL interfaces will allow users to easily and simply log into Hosted GMS to organize user group names and memberships, device group names and memberships, as well as adding and renewing subscriptions and support.

 
