Archive

Archive for December, 2010

My predictions for 2011

December 15, 2010 1 comment

As we near the end of 2010 (hard to believe I’m writing that, actually), I thought I’d make some tech predictions for 2011. I’m not too concerned with accuracy, as they’ll no doubt leave me embarrassed in a year’s time, but it’s fun to try and anticipate where we’re going.

1) Take 2 tablets and call me in the morning
You don’t have to be a doctor to know that tablets will be everywhere in 2011. In 2010 the iPad was the hot ticket, but in 2011 I think Android tablets will gain significant market share. Personally, however, I’m hoping that Microsoft releases a Windows alternative with good battery life and a nice interface, so that I have a proper OS on the tablet. As with smartphones, more of these tablets will be brought into the enterprise, bringing compliance and security issues, but also a need for reporting and BI solutions.

2) Intel “Oak Trail”
Speaking of tablets, the very thing that might make Windows feasible on a tablet could be the new Intel “Oak Trail” CPU, promising good performance and yet good battery life. Of course, to go with it Intel might punt their “MeeGo” OS, yet another contender for your tablet Rands (or Dollars).


3) The future is Fusion ?
So here’s my controversial prediction: I don’t think we’ll see a compelling version of AMD Fusion in 2011. Oh, they’ll get the chips out (probably before H1 is over), but looking at what Intel did with the Core i5 600 series, which was a complete waste of time graphics-wise, I’m not holding my breath for anything exciting. Let’s put it this way: system builders will want proper graphics cards, and these chips will probably be too expensive for the low-end market (surprise me, AMD). Fusion might work for a low-power notebook scenario, but then again recent AMD chips have always lost out to Intel with regard to power consumption. I might sound like an Intel fanboy here, but I’m not. From a competition point of view, I obviously want AMD to bring out great stuff. Then again, Intel have never been good at graphics.

4) Kinect – more than a toy
Like all Star Trek fans, I’m looking forward to when we can have computers that are as cool as the Enterprise main computer. I like what I see in Kinect, but not for gaming. I’m sure that someone somewhere inside MS has bigger plans for it. My personal feeling is that the Kinect audio commands and motion detection will eventually find their way to the PC, maybe with Windows 8? Before you say “that’s not going to work”, how cool was the computer in Minority Report?



5) Unified Communications
I’m starting to see more businesses deploy Unified Communications. It brings about massive cost savings, and the sheer convenience factor is also a plus (video conferencing from your desk, integration with Outlook and SharePoint intranet sites). Microsoft has recently announced Lync, and after seeing what can be done with it, I’m convinced that, in the enterprise, this is the way to go: Skype on the outside, Lync on the inside for the “connected” enterprise. I use OCS for Live Meeting video conferences and don’t have to go to the physical meeting. This is cool, although I’ll have to work harder at the gym to offset this new laziness.



6) SQL Virtualization
This is a SQL blog, so here’s a SQL trend. While virtualization has been taking off for years within the enterprise, many have always been hesitant to virtualize SQL Server. Web servers and app servers were the primary candidates for going virtual, mostly because SQL Server was seen as too demanding, but also because previous virtualization solutions probably didn’t perform well with SQL Server. This is starting to change, as many companies are now starting to virtualize their “Tier 2” database applications. The big benefits are high availability (with Live Migration), hot-add of CPU and memory and, of course, cost savings on hardware. I think more people are now open to this, although personally I wouldn’t virtualize my more intensive SQL applications, and the experts seem to agree. The easiest candidates for virtualization are your older applications on older servers that don’t have hectic storage and throughput requirements. Next stop – SQL Azure in the cloud…



7) Context-aware computing
Another “holy grail” for futurists; however, I do not like where this is going. In sci-fi movies this was cool, since the computer is intelligent and your interaction with it is at another level because it’s aware of its environment. It’s not even a computer that you’re interacting with, since it’s not a grey box that we’re talking about, but something more like an intelligence in your immediate vicinity. Think HAL 9000. The term “computer” is outdated here. I think of it as your own Local Artificial Intelligence (LAI). I also wonder how this will work in future. When you are at your desk, do you have a LAI that you interact with there, and that moves with you throughout the building? Or is there a HAL 9000 in the building for everyone? What happens when you go home – will your house have a LAI? But back to the topic.


Now, in our real capitalist world, it seems all the research being done here is not to enhance our interaction with the LAI, but to use these concepts to increase advertising opportunities. Google recently announced it was working on “search before you search”, which to me means “adverts before you search”. This isn’t anything to do with HAL 9000, but is basically Google bringing context awareness to their search engine – the first step that can be taken at this stage with our current interfaces. I suppose when this technology is developed to solve specific problems, we will start to see really cool implementations of it. But maybe other companies will release their interpretations of this in 2011?


Those were just some of the thoughts on my mind at the moment. Of course there are other things that are more predictable, like:

1) More cores in CPUs
2) More megapixels in cameras
3) More blades in razors

More, more, more… it never ends. At least the German car companies have ended the horsepower wars.
Categories: Futurism

Writing code that scales – Part 4 – Temp DB

December 8, 2010 Leave a comment

TempDB is not normally something that SQL developers think about – after all, it’s a “background” feature of SQL Server that has nothing to do with the T-SQL code or stored procedure that they are working on.

However, that assessment is incorrect. TempDB is a vital cog in the engine (or should that be gearbox?) of SQL Server, and if your code places too much strain on TempDB, it will definitely lead to performance and scalability issues later on.

While you might not be too concerned with the setup of TempDB as a developer (e.g. whether there are multiple data files on multiple disks), some of your code will have a direct impact on it, the most obvious case being when you use temp tables. If you are the type of developer who decides that it’s a good thing to create temporary tables whenever possible, remember that the performance you get in the development environment is not the performance you will get in production. On the production server, TempDB will be shared with other databases.

Not only will this impact your own code, but if you use TempDB inefficiently, it will also impact other production applications.

Things to Consider

Before you create temp tables:

  1. Don’t use temp tables unnecessarily. Sometimes a clever statement (like a CTE) will do the job. A CTE is essentially an inline view, so it won’t create a temp table at all, although for very large intermediate results the optimizer may still spool to TempDB. Another option for small data sets is the table variable, although once again, if the set of data is very large, even a table variable will write to TempDB.
  2. If you have a set of data that will be referenced by multiple queries, why not persist it to a proper table? You can always truncate it when it’s no longer needed, and it will be stored on the disk that your database uses, not the potentially high-impact disk that TempDB uses.
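To illustrate point 1, here’s a sketch of the two alternatives. The table and column names (dbo.Orders, OrderDate and so on) are made up for the example:

```sql
-- A CTE keeps the intermediate set inline instead of materialising a #temp table.
WITH RecentOrders AS
(
    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.Orders
    WHERE OrderDate >= DATEADD(MONTH, -1, GETDATE())
)
SELECT CustomerID, COUNT(*) AS OrderCount
FROM RecentOrders
GROUP BY CustomerID;

-- For a small data set, a table variable is an option (no explicit DROP needed;
-- it goes out of scope with the batch).
DECLARE @RecentOrders TABLE (OrderID INT PRIMARY KEY, CustomerID INT);

INSERT INTO @RecentOrders (OrderID, CustomerID)
SELECT OrderID, CustomerID
FROM dbo.Orders
WHERE OrderDate >= DATEADD(MONTH, -1, GETDATE());
```

Remember that the table variable only helps for small sets – a large one will spill to TempDB anyway.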

If you have to use temp tables:

  1. Always drop the temp table after you have used it.
  2. You don’t have to put all the columns and rows from the permanent table into the temp table.
  3. Try not to use SELECT INTO to create the temp table. This will hold locks on system objects while SQL Server figures out how to create the temp table.
  4. Consider placing a clustered index on the temp table. There will be a cost to populating it, but if it is a large data set being referenced by many operations, it might be valuable (in this case, however, I still prefer a permanent table).
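Putting those four points together, a temp table created “properly” might look something like this (table and column names are hypothetical):

```sql
-- Point 3: explicit CREATE TABLE rather than SELECT INTO,
-- Point 2: only the columns we actually need.
CREATE TABLE #ActiveCustomers
(
    CustomerID INT         NOT NULL,
    Region     VARCHAR(10) NOT NULL
);

-- Point 4: a clustered index, worthwhile if many operations reference this set.
CREATE CLUSTERED INDEX IX_ActiveCustomers ON #ActiveCustomers (CustomerID);

-- Point 2 again: only the rows we need.
INSERT INTO #ActiveCustomers (CustomerID, Region)
SELECT CustomerID, Region
FROM dbo.Customers
WHERE IsActive = 1;

-- ... use #ActiveCustomers in subsequent queries ...

-- Point 1: always drop it when you’re done.
DROP TABLE #ActiveCustomers;
```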
Categories: SQL Server