

Business Intelligence 2.0: Simpler, More Accessible, Inevitable

Jun 07, 2016, 3:06 PM


Say goodbye to complicated interfaces, disconnected analytics and shelfware. An emerging era for BI will bring simplicity, broad access and better ties between analysis and action.

By Neil Raden, 2007

Why do three billion people a day use a Web site that has a Spartan interface, a weird name (Google) and the capacity to do only one thing (the original search engine, that is, not the newer applications)? Perhaps the reason is that it always gives you more than you asked for, but what you want is usually on the first page. No one needs to install software, to do upgrades or, for that matter, to pay for it. And no one needs a training class.

Contrast this with Business Intelligence (BI) tools today, with complicated interfaces, software that is very expensive to purchase and maintain, and version upgrades that are often painful. These tools generally offer very broad functionality, but somehow it's never quite enough to solve the problem. There are, at best, a few million copies licensed worldwide (across all BI vendors), yet more than a third of these seats are shelfware, according to Optima Publishing's OLAP Survey 4. One could argue that the phenomenon of the Web is so unique that it is unfair to measure BI's performance against it. The problem is that this is the second time BI has been eclipsed by another technology, as evidenced by the more than 200 million licensed copies of Microsoft Excel in use today.

Rest assured, the current era of BI is coming to an end and will be succeeded by a BI 2.0 era that promises simplicity, universal access, real-time insight, collaboration, operational intelligence, connected services and a level of information abstraction that supports far greater agility and speed of analysis. The motivation for this "version upgrade" for BI is the need to move analytical intelligence into operations and to shrink the gap between analysis and action.

BI 1.0 Defined

Despite its drawbacks, BI is not a failure. In most ways, the facility for getting information to people where and when they need it is dramatically better than it was more than a decade ago, when reports were usually paper-based and required months of development to line up the data. But if you boil BI down to its basics, it is derived from only two things: data and reports. Most of the effort in BI in the last decade has been focused on the data issue: data integration, data quality, data cleansing, data warehouse, data mart, data modeling, data governance, data stewardship. BI tools are dependent on these efforts.

Existing BI solutions are designed primarily for people who can understand the data models and who have time to build analyses from them, recall them for future use and provide information for others. Within most organizations, these experts account for about 5 percent of the salaried workforce. That was fine for a decade or so while we worked out the problems and fine-tuned the architectures and methodologies, but the people who are supposed to be served by this effort have largely voted with their feet and marched back to shadow systems, particularly spreadsheets, where the bulk of analytical work still takes place. While the number of BI users is increasing, the bulk of the increase is in passive report distribution, not active analysis, collaboration or decision-making.

Along with data, reporting drives BI. What is a report? At the simplest level, a report is the rendering of information requested from existing data, with at least some level of formatting and usually some added calculations, such as subtotals and totals at a minimum. Reports are read-only. Originally, reports were distributed on either paper or microfiche, but about 25 years ago it became possible to view them electronically. Today, they can be delivered to the Web, to portals, to cell phones, to PDAs, just about anywhere. Making reports "interactive" doesn't really change their nature. The ability to select parameters, for instance, is actually a reporting application surrounding just another report. The ability to navigate through the elements of the data model interactively, as in OLAP, ultimately ends up as the visual display of just another report. No matter how many ways the fetching and formatting of data are embellished, it is still a report. Some OLAP tools are not read-only, but their use represents a tiny fraction of BI use which, in turn, is an almost imperceptible fraction of the daily use of alternatives, especially spreadsheets.
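To make that definition concrete, here is a minimal Python sketch of what a report boils down to: fetch existing data, add a few calculations such as subtotals and a grand total, and format the result. The rows, column names and figures are hypothetical, invented only for illustration.

```python
from collections import defaultdict

# Hypothetical rows, standing in for data fetched from a warehouse or mart.
rows = [
    {"region": "East", "product": "A", "revenue": 1200.0},
    {"region": "East", "product": "B", "revenue": 800.0},
    {"region": "West", "product": "A", "revenue": 950.0},
    {"region": "West", "product": "B", "revenue": 1100.0},
]

# The "report": group the fetched data, compute subtotals and a grand total, format.
subtotals = defaultdict(float)
for row in rows:
    subtotals[row["region"]] += row["revenue"]

print("Revenue by region")
for region, total in sorted(subtotals.items()):
    print(f"  {region:<6} {total:>10,.2f}")
print(f"  {'Total':<6} {sum(subtotals.values()):>10,.2f}")
```

Everything after the fetch is presentation; nothing here writes back or triggers an action, which is exactly the limitation the next era is meant to address.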
BI 2.0: What to Expect

The new era of BI, which is already here, goes far beyond data and reporting. BI is becoming proactive, real-time, operational, integrated with business processes, and it is extending beyond the boundaries of the organization. To become pervasive and grow out of its reporting niche, BI has to provide simple, personal analytical tools on an as-needed basis, with a minimal footprint and cost. Rather than relying solely on a rigid metaphor like data warehousing, BI needs the ability to access data anywhere it can be found and to perform integration on the fly if necessary. Locating the right information to solve problems must be a semantic process that does not require knowledge of data structures or canonical forms. Directed search, based on the meanings of and relationships among objects, allows practically any person or service to find what is needed without assistance, whether it lives in structured sources such as relational databases, message queues, logs and web services, or even in unstructured content.

The Web was not just a brilliant idea; it was, and is, a world-changing phenomenon. Yet even in light of this success, it is quickly moving to an entirely new plane of existence, one that is more collaborative, intelligent, rich, semantically aware and simple. Web 2.0 is about closing the loop, moving from a collection of web pages to being the single platform for just about everything. The physical resources are in place and attractively priced, and the open standards are evolving without fracturing the confederacy of technology players, large and small. The missing piece is semantic technology, but even that is progressing nicely as semantics pure plays are gobbled up by more well-heeled operations like IBM, Google and webMethods.

BI needs a 2.0 initiative too. BI 2.0 won't replace BI; it will extend it. Current BI capabilities are well suited to many tasks, such as individualized dashboards of overnight data or a rich environment for interactive exploration of data, by people and by algorithms. But current BI practices have always been weak at dealing with the unexpected and the urgent because of their "data first" orientation. Spreadsheets shine here, but they lack too many architectural prerequisites to be useful, such as collaboration, security, abstraction and versioning, among others.
If spreadsheet tools develop these characteristics, and they are heading in that direction, they will not usurp BI; they will become BI.

Game Changers

BI 2.0 will disrupt the BI industry in quite a few ways, and foot-dragging can be expected. Some of the more obvious effects will be:

  • Convergence: The unnatural separation of BI technology and staff from the rest of the operations of the enterprise will fade as BI becomes ubiquitous and mission-critical.
  • Batch Data Warehousing: This activity will continue, but at a reduced rate, as the need for information exchange with partners, customers, regulators and other stakeholders becomes a 24/7 proposition. Faster computers, more memory and better metadata (semantic models) will support better data comprehension and reduce the need for data integration.
  • Methodologies: Data warehouse methodologies that have been laid down and tuned for the past decade will take a while to pry loose from the collective consciousness. Many best practices will be largely invalidated by BI 2.0 and new ones will emerge, but conventional wisdom is tenacious. Expect a lot of friction here. Remember 3NF versus star schema, or MOLAP versus ROLAP?
  • The \"Pyramid\": This model of BI usage needs to be scrapped. The idea of separating people into bands of a pyramid based in whether they are power users or report readers never worked in the first place. Even \"unsophisticated\" users had a need to do a little modeling once a while, that's why they have Excel. In BI 2.0, roles are infinitely malleable and a person can operate in different roles simultaneously.
  • BI Licensing: It's very likely that BI software sales and licensing will evolve, too. If organizations can transform from 100 BI users to 10,000 in a month, the market will not allow the incumbents to reap a 100x windfall in revenue. Open-source and On-Demand channels will put more pressure on the traditional vendors to retire expensive, per-seat perpetual licenses.
  • Data Comprehension: Data integration is a painstaking process in which people sift through the data and then automate the result with a tool. Tools are emerging that can take over the tedious integration work, at least to some extent, and the rise of Master Data Management (MDM) hubs can further reduce the time-consuming workload. Coupled with better semantics-based metadata, on-the-fly data comprehension solutions, for both internal data and for connected flows of data from partners, will emerge within the next 12 to 18 months (a rough sketch of the idea follows this list).
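As a rough illustration of what "on-the-fly" comprehension means, the sketch below relates two hypothetical sources, an internal customer extract and a transient partner feed, by a shared key at query time instead of through a batch warehouse load. The source shapes, field names and matching rule are assumptions made for this example only.

```python
import csv
import io

# Hypothetical internal source: an operational extract, read exactly as delivered.
internal_csv = io.StringIO(
    "customer_id,name,region\n"
    "C001,Acme Corp,East\n"
    "C002,Globex,West\n"
)
internal = {row["customer_id"]: row for row in csv.DictReader(internal_csv)}

# Hypothetical partner feed: transient records that never land in a warehouse.
partner_feed = [
    {"cust": "C001", "open_orders": 3},
    {"cust": "C002", "open_orders": 7},
    {"cust": "C999", "open_orders": 1},   # unknown to the internal source
]

# On-the-fly comprehension: relate the two sources by a shared key at query time.
for record in partner_feed:
    match = internal.get(record["cust"])
    name = match["name"] if match else "<unmatched>"
    print(f"{record['cust']}: {name}, open orders = {record['open_orders']}")
```

In the scenario the article anticipates, semantics-based metadata would discover that shared key automatically rather than having it hard-coded as it is here.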
Close the Loop

Today, analytics is a singular process. An individual views a report in some form or creates a query to understand something. No matter how sophisticated or naïve, the result is always the same (assuming it works): the analyst has informed his or her opinion or hypothesis, but the analytical tool stops there. It isn't possible to replay the steps and show others how the question was resolved. Nor is it possible, without a custom-made application, to transfer this new knowledge to a system or service that can act on it immediately. Nor is it possible, again without building an application, to track the results of the decision explicitly (as in, "Orville, the action you took on pricing based on the new parameters that entered the system on Tuesday is showing a marked improvement in on-time arrivals at the terminal."). Delivering such capabilities would bring a Nirvana of decision support, something we have all been envisioning for a decade or more, but BI in its current state is not designed to close the loop. That's the promise of BI 2.0, and, in fact, it's the single driving reason for this new era to emerge.

Try New Thinking

Learning new skills is important, and it's easy. What's difficult is wrapping your brain around all of this and getting ready to walk away from what you now accept as the correct approach. For instance, here are six BI 1.0 fallacies that fall by the wayside in the BI 2.0 era:
BI 1.0 Fallacies vs. BI 2.0 Realities
  • Fallacy: Most users want to be spoon-fed information and will never take the initiative to create their own environment or investigate the best way to get the answers they need. Reality: The Consumer Web invalidates this idea. When given simple tools to do something that is important and/or useful to them, people find a way to "mash up" what they need.
  • Fallacy: Vendors will obfuscate and slow down the drive for simpler and more affordable tools to preserve their bases. Reality: They will, but demographics will pressure them. Most BI "users" will be members of a generation that lives in technology and will reject the functionality of current BI.
  • Fallacy: Only air traffic controllers and credit card approval applications need real-time data. Reality: The availability of fresh data, from ever-widening sources, generates its own demand.
  • Fallacy: Analytics cannot be supported until there is an enterprise data warehouse, with a metadata repository, data stewards and a comprehensive data model that represents the "single version of the truth." Reality: Data comprehension will displace data warehousing, to some extent. The single version of the truth will give way to context, contingency and the need to relate information quickly from many sources.
  • Fallacy: Operational systems cannot be queried for analytics. Reality: There is no longer a good reason for this prohibition. In fact, with SOA, it doesn't even make sense.
  • Fallacy: Data must exist in a persistent data store for analytics. Reality: Message queues, logs, sensors, transient data and caches, temporary aggregates, lingering partial-result files: all of these can be leveraged now with the resources at hand (a minimal sketch follows this list).
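As a minimal sketch of that last point, the snippet below computes running delay statistics directly over a stream of messages, standing in for a queue or sensor feed, without persisting anything. The message shape and the metric are hypothetical, chosen only to illustrate analytics over transient data.

```python
from collections import defaultdict

# Hypothetical transient messages, as they might arrive from a queue or sensor feed.
messages = [
    {"terminal": "A", "minutes_late": 4},
    {"terminal": "B", "minutes_late": 0},
    {"terminal": "A", "minutes_late": 12},
    {"terminal": "B", "minutes_late": 3},
]

# Running aggregates kept in memory; nothing is written to a persistent store.
totals = defaultdict(lambda: {"count": 0, "late_minutes": 0})

for msg in messages:                       # in practice, an endless consumer loop
    bucket = totals[msg["terminal"]]
    bucket["count"] += 1
    bucket["late_minutes"] += msg["minutes_late"]
    avg = bucket["late_minutes"] / bucket["count"]
    print(f"terminal {msg['terminal']}: avg delay so far = {avg:.1f} min")
```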
Prepare for the Inevitable

BI 2.0 is a natural evolution, but it isn't happening in a smooth, continuous fashion. It's more tectonic; forces build and changes happen abruptly. Enterprise technologies tend to be sticky because of their cost, not just to implement but also to replace. However, all enterprise applications are following the lead of the Web, especially Web 2.0. With billions of daily users, the Web is the cohort for a massive longitudinal study, and the results are in: people love collaboration, social networks, mash-ups, no-cost software, versionless software and the ability to get what they need without a lot of ceremony. BI 2.0 is inevitable. What should you do to prepare? Mostly, readjust your thinking so you are open to the best offers. Here are six ideas to get you started:
  • 1. Recognize the situation: people and technology have shifted to a new plane.
  • 2. Rethink analytics: informing people to make better decisions is out; changing the nature of work is in.
  • 3. Think out of the BI box: Is compartmentalized security logical? Is role-based BI provisioning useful?
  • 4. Shift your focus from data to people: pretend data is like water. Now what do you do?
  • 5. Think less about process and more about experience: reactions to internal software are now flavored by the Web.
  • 6. Think less about features and more about how people work effectively: MOLAP vs. ROLAP is history; social networking and collaboration are the future.
BI is not a failure and it's not dead, either. BI 1.0 provided a great deal of utility, but the next wave will be hoisted up, as Isaac Newton described his superhuman accomplishments, by "standing on the shoulders of giants." So climb on. The view is great.

Neil Raden is the founder of Hired Brains, providers of consulting, research and analysis in Business Intelligence, Performance Management, real-time analytics and information/semantic integration. Write him at nraden@hiredbrains.com.
