SQL Server 2012
Not many databases scale from in-app to cloud, but that’s what Microsoft has done with SQL Server. Developers can embed it in their applications, or use SQL Azure to deliver cloud-scale data. Now it’s time for the rest of the SQL Server family to benefit from the lessons Microsoft learned building the cloud-hosted, scalable SQL Azure – and to get your data ready for the Windows Server 8 private cloud.
Microsoft has reduced the number of SQL Server editions in the 2012 release. The low-end Workgroup and Small Business Editions have been retired, along with the high-end Datacenter Edition. That leaves the Standard and Enterprise Editions, which are joined by a new member of the SQL Server family, the Business Intelligence Edition. Standard is designed for departmental databases, while Enterprise is for large-scale applications and data warehouses; the new BI edition adds a range of analytic tools – including the new Power View self-service analysis and reporting tools – to the functionality of the Standard edition.
Installation is quick and straightforward, with a set of wizards to help ensure that the correct prerequisites are installed, and that your system is tuned for running databases
Performance and the private cloud
Performance matters for any modern database: you need to deliver query results faster than ever, to many different applications around the organisation. SQL Server 2012 adds new index types to help speed up operations, especially on large databases. One approach is aimed at read-only workloads, like data warehouses. ColumnStore indexes add a new way of viewing existing tables, allowing queries to fetch only the columns they need rather than whole rows. It’s an approach that can deliver up to 100x speed increases – something that’s increasingly important where data warehouses may have billions of rows and thousands of columns. ColumnStore indexes aren’t held in memory but are fetched as required, improving the performance of the in-memory operations needed by many large business intelligence systems. ColumnStore indexes are just one part of a set of technologies that Microsoft is branding xVelocity, focused on support for in-memory databases.
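As a sketch of how this looks in practice, SQL Server 2012 adds a columnstore option to the familiar CREATE INDEX statement (the table and column names here are hypothetical):

```sql
-- Create a nonclustered columnstore index over the columns that
-- warehouse queries actually scan. Note: in the 2012 release, adding
-- a columnstore index makes the underlying table read-only.
CREATE NONCLUSTERED COLUMNSTORE INDEX ix_FactSales_cs
ON dbo.FactSales (SaleDate, ProductKey, StoreKey, Quantity, SalesAmount);

-- Aggregate queries that touch only indexed columns can now read
-- compressed column segments instead of scanning whole rows:
SELECT ProductKey, SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY ProductKey;
```

Because the table becomes read-only, the usual pattern is to drop and rebuild the index around scheduled warehouse loads.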
SQL Server 2012 also adds improved data encryption features, with end-to-end encryption that works both locally and in SQL Azure. Access control is tied to Active Directory, and can be used in conjunction with SharePoint to secure analytic models and data. AES256 is used for keys, with SHA512 for password hashes. Integrated audit tools also make it easier to show that you’re complying with the regulations relevant to your data.
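Transparent Data Encryption, for example, can be switched on per database. This is a minimal sketch, assuming a certificate-protected encryption key; the database name, certificate name and password are placeholders:

```sql
USE master;
-- Placeholder password: substitute your own strong password.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pl@ceholder-Passw0rd';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO

USE SalesDB;
-- AES256 keys, as noted above.
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
GO

ALTER DATABASE SalesDB SET ENCRYPTION ON;
```

Back up the certificate and its private key immediately; without them an encrypted database cannot be restored elsewhere.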
Microsoft has been talking a lot about private cloud recently – witness its positioning of System Center 2012 and Windows Server 8. So it’s no surprise to find the company talking about SQL Server 2012 as another part of its cloud strategy. There’s certainly a lot in this release that builds on the cloud-hosted SQL Azure. For example, there’s tooling for resource pooling, for scale (both up and out), for self-service management and for utility-like operations – concentrating on support for SQL Server in virtualised environments.
Microsoft provides multiple ways of managing SQL Server 2012, including PowerShell; using the PowerShell SQL Server cmdlets, scripts can be authored on your desktop and then run remotely across several servers
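A hedged sketch of that approach, using the SQLPS module that ships with SQL Server 2012 (the server names are hypothetical):

```powershell
# Load the SQL Server cmdlets; -DisableNameChecking suppresses the
# warning about non-standard verb names in the SQLPS module.
Import-Module SQLPS -DisableNameChecking

# Run the same query against several instances from one desktop.
$servers = 'SQL01', 'SQL02', 'SQL03'
foreach ($server in $servers) {
    Invoke-Sqlcmd -ServerInstance $server -Query @"
SELECT @@SERVERNAME AS ServerName,
       COUNT(*)     AS DatabaseCount
FROM sys.databases;
"@
}
```

The same loop could just as easily apply a patch script or collect health data, which is what makes scripted multi-server management attractive.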
It’s certainly an approach that many organisations will find attractive. As SQL Server has become easier to use, there’s been an explosion of departmental databases built with it rather than Access. A version of SQL Server that allows consolidation and control, while still supporting self-service operation, will give both IT departments and business users what they want. To support this, Microsoft is also releasing a suite of System Center Management Packs for SQL Server 2012 (along with tools for building SQL Server application templates for use with System Center 2012’s self-service tools). Other features that help build a scalable SQL Server environment include the new AlwaysOn features, with support for Availability Groups, and large clusters that work across multiple datacentres. There’s also increased support for Microsoft’s Hyper-V virtualisation platform, with support for Live Migration and server replication. You can also scale dynamically, with the Resource Governor controlling the memory and CPU used by the SQL Server database engine.
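Resource Governor works by routing sessions into workload groups backed by resource pools. A minimal sketch, with hypothetical pool, group and login names:

```sql
USE master;
GO
-- Cap the CPU and memory available to a reporting workload.
CREATE RESOURCE POOL ReportingPool
    WITH (MAX_CPU_PERCENT = 30, MAX_MEMORY_PERCENT = 25);
CREATE WORKLOAD GROUP ReportingGroup USING ReportingPool;
GO

-- Classifier function: routes sessions into a group by login name.
CREATE FUNCTION dbo.rg_classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    DECLARE @grp sysname = 'default';
    IF SUSER_SNAME() = 'reporting_user'
        SET @grp = 'ReportingGroup';
    RETURN @grp;
END;
GO

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.rg_classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```

Sessions from the reporting login are then throttled without affecting the rest of the instance.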
Microsoft will be providing documentation to help organisations deploy their own cloud platforms, as well as working with HP to deliver a reference architecture for SQL Server 2012 private clouds. There’s also the option of using preconfigured HP Database Consolidation Appliances to quickly roll out cloud-scale databases in your own datacentre. A half-rack Consolidation Appliance can host around 200 databases, with the system scaling to ten full racks; each rack has 192 cores and 58TB of storage.
High availability and big data
Any SQL Server 2012 cloud will need to take advantage of its new high-availability features. Availability Groups are easy to put together, and can be used to protect key databases. It’s also easier to build clustered databases, with support for up to four secondaries for multi-database failover. There’s also enhanced disaster recovery support, with tools for handling multi-site failover (letting you host secondaries off-site using shared storage). It’s worth noting that there are many other scenarios where these features will come into play – especially the improved high-availability support. AlwaysOn databases are a key part of critical business processes, as well as a tool for supporting web services and sites. Server Core support should also increase uptime, reducing the need for downtime due to patching and giving servers a smaller attack surface.
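Creating an availability group is a few statements of T-SQL. This sketch assumes both replicas are already nodes in the same Windows failover cluster with AlwaysOn enabled; the server, database and endpoint names are hypothetical:

```sql
-- On the primary: group a database with one synchronous secondary.
CREATE AVAILABILITY GROUP SalesAG
FOR DATABASE SalesDB
REPLICA ON
    N'SQL01' WITH (ENDPOINT_URL = 'TCP://SQL01.example.com:5022',
                   AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                   FAILOVER_MODE = AUTOMATIC),
    N'SQL02' WITH (ENDPOINT_URL = 'TCP://SQL02.example.com:5022',
                   AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                   FAILOVER_MODE = AUTOMATIC);

-- On the secondary: join the group, then join the restored database.
-- ALTER AVAILABILITY GROUP SalesAG JOIN;
-- ALTER DATABASE SalesDB SET HADR AVAILABILITY GROUP = SalesAG;
```

Asynchronous-commit replicas can be added in the same way for the off-site disaster recovery scenario described above.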
Big databases mean big data, and Microsoft has used its work on SQL Azure to improve how SQL Server handles large-scale databases. Rather than throwing data away, it can be stored for possible future investigation using analytic tools. Part of the new approach is integration with tools like Hadoop, using a connector to deliver data from SQL Server and receive processed results for analysis.
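The connector is built on Sqoop, so the workflow looks something like the following sketch (the server, table and HDFS path names are all hypothetical):

```shell
# Pull a SQL Server table into HDFS for Hadoop processing.
sqoop import \
  --connect "jdbc:sqlserver://sql01.example.com:1433;databaseName=SalesDB" \
  --username hadoop_etl --password-file /user/etl/.sqlpwd \
  --table FactSales --target-dir /warehouse/factsales

# ...run MapReduce or Hive jobs over /warehouse/factsales...

# Push the processed results back into a SQL Server table for analysis.
sqoop export \
  --connect "jdbc:sqlserver://sql01.example.com:1433;databaseName=SalesDB" \
  --username hadoop_etl --password-file /user/etl/.sqlpwd \
  --table SalesSummary --export-dir /warehouse/summary
```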
SQL Server 2012’s Data Quality Services tools help ensure your data is clean, ready for use in business intelligence applications
Business intelligence and development
We’re storing more and more data, and SQL Server 2012 can handle that. But we also need to get insight from that data, to help businesses make the right decisions. That’s one of the reasons why Microsoft has bolstered the analytic tools built into its database, with a whole new Business Intelligence Edition that can work with both structured and unstructured data.
In the past, much of Microsoft’s analytics approach has been centred on Excel and its PowerPivot modelling tools. That changes with SQL Server 2012, which introduces a more database-centric approach to BI. Part of this is a new semantic model for BI data, which lets you have one model that can be used (or created) by individuals, teams or IT. SQL Server 2012’s BI semantic model handles business logic, storage and use of SQL Server’s VertiPaq in-memory engine. There’s also improved support for the now-venerable SQL Server Reporting Services, with the new Power View visual report design tools and integration with SharePoint, as well as the ability to build alerts into BI reports and models.
Part of the SQL Server 2012 BI tooling, PowerPivot lets you build BI reports in Excel, working with massive data sets in real time
You can use SQL Server 2012’s Business Intelligence Development Studio to create and manage semantic models for all your business intelligence applications and reports
Data warehouses using SQL Server 2012 get a performance boost from the new columnstore indexes, but that’s only part of its BI story. PowerPivot gets an upgrade, with new tools for more complex analytical models – including support for hierarchies and multiple relationships, making it easier to understand how data is linked. The resulting models can now be converted into applications and embedded in dashboards using SharePoint, so insights from one part of a business can become tools that help the whole organisation. IT departments also get greater control over how PowerPivot is used, while end users can still design and build their own analytic models. More complex models can be built using the BI Development Studio, which integrates with Visual Studio and lets you take a project-based approach to building analytic models.
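PowerPivot models are built up from measures written in DAX, the formula language shared with the new tabular models. As a flavour of what that looks like (the Sales and Date table names are hypothetical):

```dax
// A basic measure: total of a numeric column across the current filter context.
Total Sales := SUM(Sales[SalesAmount])

// Time intelligence over a related date table: year-to-date sales.
Sales YTD := TOTALYTD([Total Sales], 'Date'[Date])
```

Because measures respect whatever filters a report applies, the same two definitions drive pivot tables, dashboards and Power View visualisations alike.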
One of SQL Server 2012’s new features, Power View, lets you build interactive data visualisations in your web browser, and then share them with the rest of a team or the whole organisation
Power View is Microsoft’s response to tools like Tableau, focusing on data visualisation and built to run in a web browser rather than as a separate application. You start by using PowerPivot to build a data mashup from various sources, and then use Power View to create interactive visualisations based on your data. There are plenty of viewing options in Power View, but it’s missing some key visualisations, including mapping. Once you’ve created a visualisation, you can use a SharePoint library to share your Power Views.
SQL Server’s Management Studio can be used inside and outside Visual Studio, to maintain, design and configure databases
Like earlier versions, SQL Server 2012 is being released with a set of development tools – something that’s becoming increasingly important as developers need to scale databases from mobile and embedded applications, to departmental, enterprise and on to the cloud. SQL Server 2012 adds new design-time tools with validation and verification, and tools for developing locally before publishing to live databases; application and data development are linked into one environment, using Visual Studio as a common IDE. The full range of SQL Server Development Tools covers everything from table design to code analysis and SQL debugging – and, of course, database publishing. The same tools target on-premises SQL Server instances and cloud-based SQL Azure, and a local runtime means you can use familiar Visual Studio debugging features with both queries and stored procedures.
If you’re using Visual Studio 2010, SQL Server 2012 will add new project options – giving you one place to build database applications, from data model to code
One of the biggest changes in SQL Server 2012 isn’t to any of its underlying technologies – it’s to how Microsoft will be licensing it. These changes are partly a response to the way databases are being deployed – especially with regard to virtualisation – and to how they’re being used, with more focus on business intelligence and less on pure data storage. They’re also a response to the way silicon architectures have changed, and to the increasing number of cores and threads in a modern server processor. Microsoft describes it as paying for the capacity you need, or for the users you have: Enterprise Edition is restricted to core-based licensing; Business Intelligence Edition has server plus client access licensing; and Standard Edition has the choice of either the per-core or server+CAL approach.
Licences are based on a minimum of four cores per processor, with a single core licence priced at the equivalent of a quarter of the cost of the old per-processor licence. The Standard edition is limited to 16 cores, and you get the same limitation if you’re using the BI edition as a database. However, if you’re using the Enterprise edition, or the BI edition as an analysis tool, you get access to all the cores the OS supports. Confusingly, if you’re using Enterprise under the old server+CAL model (for organisations with a current SA agreement that are upgrading to SQL Server 2012), you’re limited to 20 cores per server.
A core can be either physical or virtual, and if you’re virtualising a core-licensed SQL Server 2012 you’ll need at least four core licences per VM. One benefit of the new licensing model is portability: if you’re licensing SQL Server 2012 through Software Assurance, you can move licences from your own servers to hosted private clouds, or to SQL Azure instances in the public cloud. That means you can associate SQL Server licences with your data, ensuring you have the correct licences in place wherever your data resides.
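The per-core arithmetic described above is easy to get wrong, so here is a small sketch of it; the prices are placeholders, not real list prices:

```python
# Sketch of SQL Server 2012 per-core licensing arithmetic, as described
# above: licences are sold per core, with a minimum of four core licences
# per processor (or per VM), and one core licence costs a quarter of the
# old per-processor licence. Edition core caps are not modelled here.

def core_licences_needed(processors: int, cores_per_processor: int) -> int:
    """Core licences required for one server, with the four-core minimum."""
    return processors * max(cores_per_processor, 4)

def licence_cost(core_licences: int, old_per_processor_price: float) -> float:
    """Cost under the new model, given the old per-processor price."""
    return core_licences * (old_per_processor_price / 4)

# A two-socket, six-core server needs 12 core licences; a two-socket,
# two-core server is rounded up to the four-core minimum per processor.
print(core_licences_needed(2, 6))   # 12
print(core_licences_needed(2, 2))   # 8
```

The four-core floor means that older low-core-count servers pay proportionally more per core, which is worth checking before an upgrade.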
There’s a lot of change happening in the datacentre, and it’s no surprise to see Microsoft responding to those changes in the latest release of SQL Server. With Windows Server 8 and System Center 2012 both focused on delivering private clouds, SQL Server needed to follow. The result is a database that supports cloud scenarios while improving reliability for more traditional implementations. The same changes, with support for improved in-memory operations, also form the basis of its enhanced business intelligence features. Although Power View feels like Microsoft playing catch-up with Tableau, the underlying BI architecture in SQL Server 2012 simplifies building and managing analytic services – something that both Microsoft and third parties can build on.
The result is a well-rounded database, with support for everything from departmental servers to very large-scale enterprise clouds. With Hadoop integration just around the corner, there’s a lot to be said for upgrading to SQL Server 2012 – if only to prepare your datacentre for the big-data private clouds that are on their way. Just keep an eye on the effect of per-core licensing on your costs.