Archive for category Employer

when in Japan – get free Wifi

On a trip in Japan and feeling disconnected? There’s a lot of Wifi around and some of it is free:
Bildschirmfoto 2014-03-15 um 22.39.03

 

Source 1: http://flets.com/freewifi/service.html

No Comments

second Tokyo Trip 2012 – Rakuten Technology Conference 2012

This October I had the pleasure to fly to Tokyo for the second time in 2012.

The development unit of Rakuten Japan was hosting the 7th Rakuten Technology Conference in Rakuten Tower 1 in Tokyo.

The schedule was packed with up to 6 tracks in parallel. From research to grass-roots development, it covered a lot of interesting topics.

Source 1: http://tech.rakuten.co.jp/rtc2012/
Source 2: Recorded Lectures

No Comments

Mirror, Mirror on the wall

There are many things which are underestimated when team leads think about their team and possible actions to drive progress.

One of those things is that a team needs information to maintain and gain velocity. You cannot expect everyone to know out of the blue what is important and in which direction everything is moving. To let everyone know, and to develop that direction, it's important to share information as much as possible. It's important to give everyone access to the information necessary to do a better job.

That's why we had a build monitor at sones. We had a tool that displayed the current status of our build servers to all developers. Every time someone committed a change, those build servers picked up the commit, built it and tested it with automated tests. The status of all that could be seen by all developers as things happened.

So within seconds everyone could see if their commit broke something. Even better: everyone could see it. Everyone cared that the build needed to be working and that tests needed to pass. It was everyone's job to do the housekeeping. When we switched from Team Foundation Server to Git and Jenkins this status display needed to be replaced – you could immediately tell that things went from good to not-so-good in terms of build stability and automated testing.
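Such a display is not a big engineering effort either. Just as an illustration, here is a minimal sketch of a build-status poller against the Jenkins JSON API – the server URL is a placeholder, and a real status board would render this on a big screen instead of a console:

```csharp
// Minimal sketch: poll the Jenkins JSON API and print each job's status.
// The Jenkins URL below is a placeholder; requires a reference to
// System.Web.Extensions for JavaScriptSerializer.
using System;
using System.Collections.Generic;
using System.Net;
using System.Threading;
using System.Web.Script.Serialization;

class JenkinsStatus { public List<JenkinsJob> jobs { get; set; } }
class JenkinsJob { public string name { get; set; } public string color { get; set; } }

class BuildMonitor
{
    static void Main()
    {
        var serializer = new JavaScriptSerializer();
        using (var client = new WebClient())
        {
            while (true)
            {
                // "color" encodes the last result: blue = success, red = failed,
                // yellow = unstable, *_anime = currently building
                string json = client.DownloadString(
                    "http://buildserver:8080/api/json?tree=jobs[name,color]");
                var status = serializer.Deserialize<JenkinsStatus>(json);

                Console.Clear();
                foreach (var job in status.jobs)
                    Console.WriteLine("{0,-40} {1}", job.name, job.color);

                Thread.Sleep(TimeSpan.FromSeconds(30));
            }
        }
    }
}
```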

Today I had the opportunity to take a tour of the Thomann logistics center. Standing in the support department I had this in front of me:

There were something like six big status screens displaying the day's incoming call status, sales figures and other statistics important to those who work there. It's a very effective and well-integrated way to keep information flowing.

Since I am with Rakuten I have been thinking about setting up a new status board for my team. Something that might be inspired by the awesome status board which Panic has built:

Since, compared to sones, there are a lot more things to track and handle (code, deployment, operations, overall numbers) I think such a status board will be invaluable for the team.

Source 1: http://www.panic.com/blog/2010/03/the-panic-status-board/

2 Comments

Adventures in e-Commerce and technology

Oh dear. I just realized that I never really announced or talked about the fact that I changed my employer and moved to an (old) new place.

Yes, that's right, I am not with sones anymore. Since January 1st I have been the CTO of Rakuten Germany. When I signed the contract the company was still called Tradoria – one of the first big projects I had the opportunity to work on was the so-called brand change.

A humongous Japan-based company called Rakuten bought Tradoria in the middle of 2011, and after half a year it was time to switch the brand.

As you can imagine these have been busy weeks since January 1st. I had to digest a lot of existing technology and products. I met and got to know a lot of interesting people – first and foremost a great team of developers that went through almost all imaginable pains and parties to come up with a marketplace and shop system that is a perfect base for take-off.

A short word on the business model of Rakuten – if you're a merchant you've got to love it: think of Rakuten as a full-service provider for merchant and customer. As a Rakuten merchant you get all the frontend and backend bliss to present and manage your products and orders. Rakuten takes care of all the nasty bits and pieces like hosting, development, telephone orders, invoicing and payment. The only thing that you as a Rakuten merchant need to do is put in great products, gather orders and send out packages. Since Rakuten isn't selling products on its own it won't be competing with the merchants like other marketplace providers do these days.

On top of that Rakuten cares for the merchant and the customer. Just a week after that successful brand change I attended (and spoke) at Tradoria Live! 2012. That's basically the merchant get-together. This year over 500 people attended this one-day conference. Think of it as a hands-on conference with features, plans, summaries of the last year and the upcoming one – every merchant is invited to come and talk in person to the people who work hard every day to make the marketplace and shop system better.


Just 24 hours later standing on that stage I found myself here:

東京

Yep. That's Tokyo (東京). After a very long flight we had the chance to take an all-embracing Tokyo tour before the meetings and talks started for our team. It was an awesome and exhausting week – just about 120 hours later I was back in Germany – I must have slept for two days 🙂

Back in Germany I had a lot of stuff to learn and work through. We had already moved to a wonderful house near Bamberg – it was pretty much pure luck to find it. It's actually ridiculously huge for a couple and two cats but we love it. Imagine the contrast: moving from an apartment next to a four-lane city street to the countryside, just a 15-minute drive away from work, with philosophical quietness all around.

Now, after about half a year, I am well into the process. I have met a lot of high-profile techies and things seem to be picking up speed in regard to teamwork in Germany and with all the other countries. It's bliss to work for a group of companies that are going through a lot of transitions while transforming from start-ups into an enterprise.

Ready for a family picture? Ready. Steady. Go!

That’s all Rakuten – that’s all on one mission: Shopping is entertainment! Empower the merchants!

Besides all that I even started to learn Japanese. ただいま 🙂

No Comments

benchmarking the sones GraphDB (on Mono (sgen) and .NET)

Since we're at it – we not only took the new Mono garbage collector through its paces regarding linear scaling but also made some interesting measurements when it comes to query performance on the two .NET platform alternatives.

The same data was used as in the last article about the Mono GC. It's basically a set of 200,000 nodes which hold between 15 and 25 edges to instances of another node type. One INSERT operation means that the starting node and all edges plus connected nodes are inserted at once.

We did not use any bulk-loading optimizations – we just fed the sones GraphDB with the INSERT queries. We tested on two platforms – on Windows x64 we used the Microsoft .NET Framework and on Linux x64 we used a current Mono 2.7 build, which will soon be replaced by the 2.8 release.

After the import was done we started the benchmarking runs. Every run was given a specified time to complete its job. The number of queries that were executed within this time window was logged. Each run utilized 10 simultaneously querying clients. Each client executed randomly generated queries of pre-specified complexity.
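For illustration, the skeleton of such a run looks roughly like this – the database client below is only a stub standing in for the real sones client, and the query generation is simplified:

```csharp
// Rough sketch of a benchmark run: N clients fire queries for a fixed time
// window and the total number of answered queries is counted.
using System;
using System.Threading;

// stand-in for the real database client used in the benchmark
class StubGraphDbClient
{
    public string Query(string query) { Thread.Sleep(1); return "{}"; }
}

class BenchmarkRun
{
    static long _queriesAnswered;

    static void Main()
    {
        const int clients = 10;
        TimeSpan window = TimeSpan.FromSeconds(1800);
        DateTime end = DateTime.UtcNow + window;

        var threads = new Thread[clients];
        for (int i = 0; i < clients; i++)
        {
            threads[i] = new Thread(() =>
            {
                var db = new StubGraphDbClient();
                while (DateTime.UtcNow < end)
                {
                    // a real run generates random queries of pre-specified complexity here
                    db.Query("<randomly generated query>");
                    Interlocked.Increment(ref _queriesAnswered);
                }
            });
            threads[i].Start();
        }
        foreach (var t in threads) t.Join();

        Console.WriteLine("queries per second: {0:F0}",
            _queriesAnswered / window.TotalSeconds);
    }
}
```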

The Import

Not surprisingly both platforms are almost head-to-head in average import times. While Mono starts way faster than .NET, the .NET platform is faster at the end with a larger dataset. We also measured the RAM consumption on each platform, and it turns out that while Mono takes 17 KByte per complex insert operation on average, the Microsoft .NET Framework only seems to take 11 KByte per complex insert operation.

The Benchmark

Let the charts speak for themselves first:

mononet


benchmark-mono-sgen

benchmark-dotnet

As you can see, on both platforms the sones GraphDB is able to work through more than 2,000 queries per second on average. For the longest-running benchmark (1,800 seconds), with all the data imported, .NET allows us to answer 2,339 queries per second while Mono allows us to answer 1,980 queries per second.

The Conclusion

With the new generational garbage collector Mono has surely made a great leap forward. It's impressive to see the progress the Mono team was able to make in the last months regarding performance and memory consumption. We already consider Mono an important part of our platform strategy – this new garbage collector and these benchmark results show us that it's the right thing to do!

UPDATE: There was a mishap in the “import objects per second” row of the above table.

No Comments

taking the new and shiny Mono Simple Generational Garbage Collector ( mono-sgen ) for a walk…

“Mono is a software platform designed to allow developers to easily create cross platform applications. It is an open source implementation of Microsoft’s .Net Framework based on the ECMA standards for C# and the Common Language Runtime. We feel that by embracing a successful, standardized software platform, we can lower the barriers to producing great applications for Linux.” (Source)

In other words: Mono is the platform which is needed to run the sones GraphDB on any operating system other than Windows. It includes the so-called "Mono Runtime", which is basically the place where the sones GraphDB "lives" to do its work.

Being a runtime is not an easy task. In fact its abilities and algorithms have a deep impact on the performance of the application that runs on top of it. When it comes to all things related to memory management, the garbage collector is one of the most important parts of the runtime:

“In computer science, garbage collection (GC) is a form of automatic memory management. It is a special case of resource management, in which the limited resource being managed is memory. The garbage collector, or just collector, attempts to reclaim garbage, or memory occupied by objects that are no longer in use by the program. Garbage collection was invented by John McCarthy around 1959 to solve problems in Lisp.” (Source)

The Mono runtime has always used a simple garbage collector implementation called the "Boehm-Demers-Weiser conservative garbage collector". This implementation is mainly known for its simplicity. But as more and more data-intensive applications, like the sones GraphDB, started to appear, this type of garbage collector wasn't quite up to the job.

So the Mono team started development of a Simple Generational Garbage Collector whose properties are:

  • Two generations.
  • Mostly precise scanning (stacks and registers are scanned conservatively).
  • Copying minor collector.
  • Two major collectors: Copying and Mark&Sweep.
  • Per-thread fragments for fast per-thread allocation.
  • Uses write barriers to minimize the work done on minor collections.

To fully understand what this new garbage collector does you most probably need to read this and take a look inside the Mono sgen garbage collector code.

So what we did was take the old and the new garbage collector and our GraphDB and let them iterate through an automated test which basically runs 200,000 insert queries, resulting in more than 3.4 million edges between more than 120,000 objects. The results were impressive when we compared the old Mono garbage collector to the new mono-sgen garbage collector.
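Conceptually the workload boils down to something like the following allocation loop, which logs how long each insert takes (this is just an illustration, not the actual sones test harness). Running the same binary once with "mono" and once with "mono-sgen" yields two directly comparable curves:

```csharp
// Illustration of the kind of workload used: keep allocating nodes that hold
// edge lists to other nodes and record the per-operation latency, which makes
// the garbage collector's behaviour visible over time.
using System;
using System.Collections.Generic;
using System.Diagnostics;

class Node
{
    public List<Node> Edges = new List<Node>();
}

class GcWalk
{
    static void Main()
    {
        var rng = new Random(42);
        var nodes = new List<Node>();
        var watch = new Stopwatch();

        for (int i = 0; i < 200000; i++)
        {
            watch.Reset();
            watch.Start();

            var node = new Node();
            // roughly 20 edges per new node, similar to the 15-25 edges per insert above
            for (int e = 0; e < 20 && nodes.Count > 0; e++)
                node.Edges.Add(nodes[rng.Next(nodes.Count)]);
            nodes.Add(node);

            watch.Stop();
            if (i % 1000 == 0)
                Console.WriteLine("{0}\t{1:F3}", i, watch.Elapsed.TotalMilliseconds);
        }
    }
}
```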

When we plotted a basic graph of the measurements we got this:

 

monovsmono-sgen

On the x-axis is the number of inserts and on the y-axis the time it takes to answer one query. So it's a great measurement to see how big the impact of the garbage collector actually is on a complex application like the sones GraphDB.

The red curve is the old Boehm-Demers-Weiser conservative garbage collector built into current stable versions of Mono. The blue curve is the new SGEN garbage collector, which can be used by invoking Mono with the "mono-sgen" command instead of the "mono" command. Since mono-sgen is not included in any stable build yet it's necessary to build Mono from source. We documented how to do that here.

So what are we actually seeing in the chart? We can see that mono-sgen draws a fairly linear line in comparison to the old Mono garbage collector. It's easy to tell why the blue curve is rising – the number of objects is growing with each millisecond. The blue line is just what we expect from a hard-working garbage collector. To our surprise the old garbage collector seems to have problems coping with the number of objects over time. It spikes several times, and in the end it even gets worse, spiking all over the place. That's what we don't want to see happening anywhere.

The conclusion is that if you are running something that does more than print "Hello World" on Mono you surely want to take a look at the new mono-sgen garbage collector. If you're planning to run the sones GraphDB on Mono we highly recommend using mono-sgen.

1 Comment

How To strip those TFS Source Control references from Visual Studio Solutions

Every once in a while you download some code, fire up your Visual Studio and find out that this particular solution was once associated with a Team Foundation Server you don't know or have a login for. Like when you download source code from CodePlex and get this "Please type in your username+password for this CodePlex Team Foundation Server" prompt.

Or maybe you're working on your company's Team Foundation Server and you want to put some code out in public. You surely want to get rid of those Team Foundation Server bindings.

There's a fairly complicated way to do this in Visual Studio, but since I was able to produce unforeseen side effects with it I do not recommend it.

So what I did was look into the files a Visual Studio solution and project consist of. And I found that there are really just a few files that hold that association information. As you can see in the picture below there are several files side by side with the .sln and .csproj files – like the .vssscc and .vspscc files. Even inside the .csproj and .sln files there are hints that lead to the Team Foundation Server – so obviously, besides removing some files, a tool would also have to edit some files to remove the TFS association.

strip-files

So I wrote such a tool and I am going to release its source code just beneath this article. Have fun with it. It compiles with Visual Studio and even Mono xbuild – actually I wrote it with MonoDevelop on Linux 😉 Multi-platform galore! Who would have thought of that in the founding days of the .NET platform?

Bildschirmfoto-StripTeamFoundationServerInformation - Main.cs - MonoDevelop

This is easy – the small tool runs on the command line and takes one parameter: the path to a folder you want to traverse to remove all Team Foundation Server associations in it. So normally I take a check-out folder and run the tool on that folder and all its subfolders to remove all associations.
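The core of it is unspectacular – roughly the following (a simplified sketch, not the exact released source): delete the binding files and strip the TFS hints from the .sln and .csproj files underneath the given folder.

```csharp
// Sketch of the idea: remove *.vssscc / *.vspscc binding files and strip the
// TFS-related sections/elements from solution and project files.
using System;
using System.IO;
using System.Text.RegularExpressions;

class StripTfsBindings
{
    static void Main(string[] args)
    {
        string root = args[0];

        // 1) remove the source control binding files
        foreach (string pattern in new[] { "*.vssscc", "*.vspscc" })
            foreach (string file in Directory.GetFiles(root, pattern, SearchOption.AllDirectories))
                File.Delete(file);

        // 2) remove the TeamFoundationVersionControl section from solution files
        foreach (string sln in Directory.GetFiles(root, "*.sln", SearchOption.AllDirectories))
        {
            string text = File.ReadAllText(sln);
            text = Regex.Replace(text,
                @"\s*GlobalSection\(TeamFoundationVersionControl\).*?EndGlobalSection\r?\n",
                Environment.NewLine, RegexOptions.Singleline);
            File.WriteAllText(sln, text);
        }

        // 3) remove the Scc* elements from project files
        foreach (string proj in Directory.GetFiles(root, "*.csproj", SearchOption.AllDirectories))
        {
            string text = File.ReadAllText(proj);
            text = Regex.Replace(text, @"\s*<Scc\w+>.*?</Scc\w+>", string.Empty);
            File.WriteAllText(proj, text);
        }
    }
}
```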

So if you want to have this cool tool you just have to click here: Sourcecode Download

No Comments

CeBIT started and we have a demo!

The effort of 10 days materializes in a Microsoft Surface demo. And you can see it at MSDN Developer Kino every day during CeBIT.

 

IMG_0733

No Comments

Developing on a Microsoft Surface Table

At sones I am involved in a project that works with a piece of hardware I have wanted to work with for about 3 years now: the Microsoft Surface table.

I was able to play with some tables every now and then but I never had a "business case" which involved a Surface. Now that case just came to us: sones is at the CeBIT fair this year – we were invited by Microsoft Germany to join them and present our cool technology along with theirs.

Since we already had a graph visualisation tool, the idea was to bring that tool to Surface and use the platform-specific touch controls and gestures.

surface_visualgraph
the VisualGraph application that gave the initial idea

The good news was that it's easier than expected to develop an application for Surface and that all parties are highly committed to the project. The bad news is that we were short on time right from the start: less than 10 days from concept to live presentation isn't the definition of a "comfortable time schedule". And since we're currently in the middle of development it's a continuing race.

Thankfully Microsoft is committed to a degree that they even made it possible to have two great Surface and WPF ninjas who enabled us to get up to speed with the project (thanks to Frank Fischer, Andrea Kohlbauer-Hug, Rainer Nasch and Denis Bauer, you guys rock!).

surface_simulator
a Surface simulator

I was able to convince UID to jump in and contribute their design and user interface knowledge to our little project (thanks to Franz Koller and Cristian Acevedo).

During development I took some pictures which will be used here and there to promote the demonstration. To give you an idea of the progress we made, here's a before-and-after picture:

Surface_Finger2
We started with a simple port of VisualGraph to the surface table…

Surface_Finger
…and had something better working and looking at the end of that day.

I think everyone did a great job so far and will continue to do so – there's a lot of work to be done until CeBIT! 🙂

Source 1: http://www.sones.com
Source 2: http://www.microsoft.de
Source 3: http://www.uid.com/

3 Comments

sones GraphDB Visualization Tool

We want to show you something today: not everybody has an idea of what to think of and do with a graph data structure, let alone a whole graph database management system. In fact, what everybody needs is something to get "in touch" with those kinds of data representations.

To make the graphs you are creating with the sones GraphDB that much more touchable, we are giving you a sneak peek at the newest addition to the sones GraphDB toolset: the VisualGraph tool.

This tool connects to a running database and allows you to run queries on it. The result of those queries is then presented to you in a much more natural and intuitive way compared to the usual JSON and XML outputs. Even more: you can play with your queries and your data and see and feel what it's like to work with a graph.

Expect this tool to be released as open source in the next 1-2 months. Everyone can use it, everyone can benefit from it.

Oh. Almost forgot the video:

 

(Watch it in full screen if you can)

No Comments

developing a command line interface for the sones GraphDB

As you may know, my team and I are developing a graph database. A graph database is a database which is able to handle such things as the following:

510px-Sna_largesocial graph

So instead of tables with rows and columns, a graph database concentrates on objects and the connections between them, therefore forming a graph which can be queried, traversed, whatever-you-might-want-to-do.

Lately more and more companies are starting to realize that their demand for storing unstructured data is growing. Thinking about unstructured data, I always think of data which cannot simply be mapped onto columns and rows (i.e. tables). Normally, complex relations between data are represented in relation tables containing nothing but this relational information. The complexity of querying these data structures is humongous, as a table-based database needs to 'calculate' (JOINs, …) the relations every time they are queried. Even though modern databases cache these calculations, the costs in terms of memory and CPU time are huge.

Graph databases more or less try to represent this graph of objects and edges (as the relations are called there) as natively as possible. The sones GraphDB we have been working on for the last 5 years does exactly that: it stores and queries a data structure which represents a graph of objects. Our approach is to give the user a simple and easy-to-learn query language and handle all the object storage and object management tasks in a fully blown object-oriented graph database developed from scratch.
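The difference is easy to see even at the level of plain objects: in the relational world the "knows" relation below would live in a separate join table and would have to be re-joined on every query, while a graph store keeps the edge as a direct reference on the object itself (a deliberately tiny C# illustration):

```csharp
// A "vertex" as a graph database sees it: the edges are part of the object,
// so traversing to a neighbour is a pointer hop instead of a table JOIN.
using System.Collections.Generic;

class Person
{
    public string Name;
    public List<Person> Knows = new List<Person>();   // outgoing edges, held natively
}

class Example
{
    static void Main()
    {
        var alice = new Person { Name = "Alice" };
        var bob = new Person { Name = "Bob" };
        alice.Knows.Add(bob);                          // create the edge

        // "friends of friends" is just nested iteration over references,
        // no join table and no relation recalculation involved
        foreach (var friend in alice.Knows)
            foreach (var fof in friend.Knows)
                System.Console.WriteLine(fof.Name);
    }
}
```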

Since not everybody seems to have heard of graph databases, we thought it might be a good idea to lower the barriers by providing personalized test instances. Everyone can get one of these without the need to install anything – a working AJAX/JavaScript-compatible browser will suit all needs. (Get your instance here.)

Of course the user can choose between different ways to access the database test instance (like SOAP and REST) but the one we just released only needs a browser.

standard_cli

The sones GraphDB WebShell – as we call it – resembles a command line interface. The user can type a query, it is instantly executed on the database server, and the results are presented in either XML, JSON or text format.

graphdb-webshell

Granted – the interested user needs to know about the query language and the possible usage scenarios. Everyone can access a long and a short documentation here.

Source 1: http://en.wikipedia.org/wiki/Social_graph
Source 2: http://www.sones.com
Source 3: Long documentation
Source 4: Short documentation

2 Comments

sones GraphQueryLanguage and GraphDB Quick Reference

Since we all need documentation I thought it would be a great idea to create a one-pager which helps every user to remember important things like query language syntax.

You can download the cheatsheet here:

cheatsheet 

Download here.

No Comments

If you want to determine if your code is being compiled by Mono…

… you can use the "__MonoCS__" pre-processor flag.

mono-ifdef
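Note that this is a compile-time symbol defined by the Mono C# compiler, so it tells you which compiler built the assembly, not which runtime is executing it. A minimal example:

```csharp
using System;

class CompilerCheck
{
    static void Main()
    {
#if __MonoCS__
        // this branch is compiled in when the Mono C# compiler (mcs/gmcs) builds the code
        Console.WriteLine("Compiled with the Mono C# compiler.");
#else
        Console.WriteLine("Compiled with another C# compiler (e.g. csc on .NET).");
#endif
    }
}
```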

1 Comment

small tool to filter iCal / iCalendar / ICS files

I manage my appointments using Outlook on Windows and iCal on OS X. Since I am not using any Exchange service right now I was happy to find out that Outlook offers a feature to automatically export a local calendar to an iCalendar-compatible ICS file. Great feature, but it lacks some things I desperately need.

outlookg

Since I am managing my private and my business appointments in the same calendar, differentiating just by categories, I had a hard time configuring Outlook to export a) an ICS file containing all business appointments and b) an ICS file containing all private appointments. To make a long story short: it's not possible.

So I fired up Visual Studio as usual and wrote my own filter tool. I shall call it "iCalFilter". Its name is as simple as its functionality and code. I am releasing it under the BSD license, including the sources, so everyone can use and modify it.

icalfilter_1

It’s a command line tool which should compile on Microsoft .NET and Mono. It takes several command line parameters like:

  1. Input-File
  2. Output-File
  3. "include" or "exclude" -> this determines whether the following categories are included in or excluded from the output file
  4. a list of categories separated by spaces
  5. an optional parameter “-remove-description” which, if entered, removes all descriptions from events and alarms
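The core idea behind it is nothing more than walking through the VEVENT blocks of the ICS file and keeping or dropping each one depending on its CATEGORIES line – roughly like this simplified sketch (not the released source, and it skips the -remove-description part):

```csharp
// Sketch of the core idea: copy an ICS file, keeping or dropping VEVENT
// blocks depending on whether their CATEGORIES line matches the given list.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class IcsCategoryFilter
{
    static void Main(string[] args)
    {
        string inputFile = args[0], outputFile = args[1];
        bool include = args[2] == "include";
        var categories = new HashSet<string>(args.Skip(3), StringComparer.OrdinalIgnoreCase);

        var output = new List<string>();
        var currentEvent = new List<string>();
        bool insideEvent = false;

        foreach (string line in File.ReadAllLines(inputFile))
        {
            if (line.StartsWith("BEGIN:VEVENT")) { insideEvent = true; currentEvent.Clear(); }

            if (insideEvent) currentEvent.Add(line); else output.Add(line);

            if (line.StartsWith("END:VEVENT"))
            {
                insideEvent = false;
                // does this event carry one of the given categories?
                bool matches = currentEvent
                    .Where(l => l.StartsWith("CATEGORIES:", StringComparison.OrdinalIgnoreCase))
                    .SelectMany(l => l.Substring("CATEGORIES:".Length).Split(','))
                    .Any(c => categories.Contains(c.Trim()));

                // keep matching events in "include" mode, non-matching ones in "exclude" mode
                if (matches == include)
                    output.AddRange(currentEvent);
            }
        }
        File.WriteAllLines(outputFile, output.ToArray());
    }
}
```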

Easy, eh?!

Grab the Source and Binary here: https://github.com/bietiekay/iCalFilter

UPDATE: You can now access the source code on github! You can even add your changes!

1 Comment

Our first press article in the heise newsticker

What a day. A few days ago, after a lot of hard work, we launched the "Technical Preview" of our baby "graphDB" – and now the heise publishing house, namely iX, has picked up the good news and published a corresponding article in the newsticker.

By the way, if you get yourself a login for every instance that is currently running for testers, it looks like this:

hosting75instances

Wonderful to see the work of excellent developers receiving the appreciation of customers. Interest is good, and I think we will be hearing a lot more about the sones graphDB in the future!

Source: http://www.heise.de/newsticker/meldung/Objektorientierte-Datenbank-als-Webservice-866041.html

No Comments

So what exactly is Microsoft Research doing?

I am proud to announce that there's a video publicly available which shows parts and projects Microsoft Research is currently working on. It's great to see these projects, concepts and ideas become publicly available one by one:

“Craig Mundie, chief research and strategy officer of Microsoft, presents “Rethinking Computing,” a look a how software and information technology can help solve the most pressing global challenges we face today. Part of UW’s Computer Science and Engineering’s Distinguished Lecture Series, Mundie demonstrates a number of current and future-looking technologies that show how computer science is changing scientific exploration and discovery in exciting ways. He discusses the role of new science in solving the global energy crisis, and answer questions from the audience.”

uwtv

Source: http://www.uwtv.org/programs/displayevent.aspx?rID=30363&fID=6021

No Comments

and another Dell laptop just died…

this time an XPS M1330 blue-screened and only shows colored lines if you restart it:

Foto

If only the hardware were as great as their service is!

1 Comment

maybe I should…

…switch this website to another weblog software in the future. The dasBlog development isn't exactly what I would call fast-paced. It even seems that there was no movement at all over the last year regarding new features.

I took a short look at a current WordPress installation we did for our developer website at sones – and I have to admit that feature-wise this WordPress is way beyond anything I could achieve with dasBlog anytime soon.

sonesdev 

Additionally, since the skin of this site seems to be broken (especially for older browsers), I would have to do a skin redesign – and it turns out that this is way easier in WordPress than it is in dasBlog.

1 Comment

Many 0x00s in the test run results…

We have this network share where each build from all the build servers is dropped, including its test run results. It seems that we're producing a huge number of almost empty filesystem test images, which leads to astounding compression ratios:

efficiency

No Comments

once again: sones is looking for more dedicated software developers

At the end of last year I already posted a job offer here. Back then the result was a number of very interesting applicants and, in the end, highly motivated and qualified new colleagues.

Since we are once again looking for reinforcements, I am using this medium again:


sones GmbH is a young IT company located in Erfurt. We do research in the field of novel database and storage technologies and develop new and innovative products and solutions on that basis.

For our Erfurt location we are looking for, starting immediately, a

Software Developer JAVA / .NET (m/f)

You want to develop innovative software in a young team, software that breaks completely new ground in the database segment? As a software developer at sones GmbH you have the opportunity to do just that!

In a highly motivated team of developers you will work on the core of our database system. You will develop features and improve the quality of the code base with regard to stability, performance and scalability, using state-of-the-art development tools.

If you see our high standards for technical knowledge, initiative and communication as a challenge – then you are very welcome with us!

Your tasks:

  • Project planning and project management in coordination with other development areas
  • Analysis, design and implementation of new product features
  • Improving the quality of existing code with regard to stability, performance and scalability
  • Software testing and documentation
  • Evaluation of new technologies and prototyping

Requirements:

  • A degree in computer science or comparable training with convincing references (projects, previous employment)
  • Several years of experience in object-oriented software development
  • A plus:
    • Programming skills in JAVA and .NET
    • Experience with test-driven development
    • Good English skills
    • Experience with database architectures and network programming

Your soft skills:

  • Strong communication skills and a willingness to share knowledge and information dynamically
  • Reliability and an independent, creative way of thinking and working
  • A goal- and solution-oriented approach

We offer:

  • A highly motivated and qualified team
  • Exceptionally interesting and innovative fields of work
  • Plenty of room for initiative and creativity
  • Continuous opportunities for further training and development
  • The challenging environment of a high-tech start-up

Interested? Then we look forward to receiving your detailed application, including your salary expectations, at jobs@sones.de


For completeness, here is the job offer once more as a PDF:

Stellenangebot sones GmbH

No Comments

getting System.ServiceModel.AddressAccessDeniedException in automated WCF Tests

We're currently running several build processes. Each time someone checks in new code, one of the build machines gets the whole package, builds it, runs tests on it and stores the result of this whole process on the Team Foundation Server. Great stuff so far.

Until you start to do things like automated WCF testing. We're using the self-hosting capabilities of WCF to start a ServiceHost and then run tests against it. This works great locally. It does not on the build machines. Even if you promote the build service user to Administrator you won't get the love.

The error you might get would look something like this:

Capture

The exception contains a URL which tells you to add the service URL to the machine's URL Access Control List. On Windows XP and 2003 you have to install the Windows Support Tools and use the httpcfg command. On Windows Vista and 2008 you should use the already installed netsh command line tool.

Since we need to get this to work on all current and future build servers I decided to add the netsh call to the build script.

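In essence it is a target that runs netsh before the tests. A minimal sketch of such a target (the service URL and the build account below are placeholders – adjust both to your own WCF service and build user):

```xml
<!-- reserve the WCF test URL for the build account before the tests run;
     BeforeTest is one of the extensibility targets Team Build offers in TFSBuild.proj -->
<Target Name="BeforeTest">
  <Exec Command="netsh http add urlacl url=http://+:8731/MyWcfTestService/ user=MYDOMAIN\tfsbuild" />
</Target>
```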

Add this Target before any tests in the .proj file and you’re set.

Source 1: http://go.microsoft.com/fwlink/?LinkId=70353

No Comments

TechED EMEA 2009 – in Germany – will we be there? :-)

image

The dates are:

TechEd Berlin 2009 Developer
2-6 November – Messe Berlin, Deutschland – Germany

TechEd Berlin 2009 IT-Professionals
9-13 November – Messe Berlin, Deutschland – Germany

Comment if you’re going too!

No Comments

Bugs Bunny

Welcome our newest office Member!

From the sales department with love:

006

Bugs Bunny

No Comments

T-Online Venture Fund invests in sones GmbH

Bonn, April 2, 2009

The T-Online Venture Fund today announced an investment in sones GmbH. In a second financing round sones secured an investment in the single-digit million range. The additional funds will be used to develop the product to full market readiness.

The software company from Erfurt, founded in 2007, has developed a completely new, innovative database technology. The object-oriented database can directly connect the relevant information within complex, unstructured data sets and thereby sets new standards in scalability and performance. With this technology, problems in data storage and analysis that were previously unsolvable due to their complexity become manageable.

soneslogo 

"What convinced us about sones was above all the innovative technology and the approach of questioning the existing status quo. This allows completely new possibilities for data management to be created," says Christoph Schmidt, Senior Vice President at Deutsche Telekom AG for the Personal Social Networks division.

sones is currently working on extending its database technology as well as the accompanying file system. Towards the end of this year the first full version of the object-oriented database management system (DBMS) will become available. Via an SDK (Software Development Kit), software developers and partners will be able to exert far-reaching influence on the development and make changes to the system. sones invites interested software developers and potential partners to register for the preview and partner program via the website www.sones.de, in order to receive the free developer version and give feedback for future development. A web-service-based tagging and recommendation system is already available and in commercial use. "The system can be adapted to the respective requirements in the areas of e-commerce, social networks and portal/content solutions," says Alexander Oelling, Head of New Business Development at sones. Here too, sones relies on cooperation with software partners to integrate the product into the respective websites.

Mauricio Matthesius, Managing Director of sones: "The T-Online Venture Fund has recognized that our revolutionary technology can help shape the future of databases, and it puts us in a position to pursue this vision consistently."

With the T-Online Venture Fund on board, the company is looking for additional employees, above all in software development and sales.

Source 1: http://www.t-venture.de/de/topnews/090402_PM_TOVF_sones_dt
Source 2: http://www.sones.de

2 Comments

two times unfortunate stuff

First, my Vista x64 machine at home seems to get slower by the minute it is powered on – most likely because one service is eating up all the installed memory:

5gb
(screenshot from Process Explorer)

I wasn't able to figure out what the problem with it is – restarting the associated services did nothing at all – killing the process and restarting the services resulted in 5 GB of free memory…

And then there's the other thing that happened this morning. We ordered a pile of 20 hard disks before Christmas – and now 4 of them have died.

business

Farewell you little 1 Tbyte hard disk – we never had the chance to get to know each other better.

No Comments

sones got a new website

Finally, after more than two months of hard work by our marketing department, the new sones.de website is online. Hurray! 😉 It looks better and it's way more informative than the old one was.

soneswebsite

1 Comment

finally faster internet

QSC just delivered a second DSL line to our office – now even faster – 16 Mbit/s downstream should be enough for now. Since the German Telekom could not deliver more than 3 Mbit/s we had to ask QSC for their service… overall a very good customer experience so far.

If you order a DSL line in Germany from a reseller like QSC, a technician from the German Telekom is sent to your place to do the last-mile connection – in our case the guy thought it would be enough to drop the TAE socket inside the wall… which means we have to get another company to do the cabling afterwards… well.

006

No Comments

Pirates! and one more desk

Marketing got us a pirate flag – nice of 'em, isn't it? Since Henning has started his work he is currently sitting in our office – waiting for the other two guys to move into the office.

003

Pirates! HO!

 

003_stitch

3 Comments

New Notebooks and the office for the 3 new developers :-)

I've got a new workhorse 🙂 A brand new Dell Latitude E6400 arrived on Monday. It's quite a lot faster than my old one and after a fresh install it's also a whole lot better to work with.

017 

The other news is that all the new hardware for the 3 new developers arrived this week. That means that the guys can move in! 🙂

020

3x Latitude E6400, 3x Keyboard+Mouse, 3x Sennheiser Headset, 3x 24” Widescreen

6 Comments

Sit down please.

Ha! I almost forgot to write about the cool sofa which was delivered last week (bringing us up to 99.99% office completeness):

IMG_3974

 

It's comfy and looks great – now the only thing left is the silver screen for the projector… the projector itself and the Xbox 360 are already here 🙂

P.S.: Wanna work for us?

No Comments

Using Jabber to monitor Windows EventLogs

Like every company, we've got several machines working just for our infrastructural needs, like SharePoint, Active Directory, databases, backup servers and so on.

To keep an eye on all those machines we came up with the idea of using Jabber instant messaging for monitoring. For example the VPN should drop a line to specified Jabber addresses if someone connects or disconnects. Every single machine maintains its own log – which means you would have to consolidate them in some way. And consolidation alone is not the master plan either – you would also need an alarm system which sends out alerts if something weird is happening.

So we wrote (while waiting for the machines to install) several small tools which provide a gateway between syslog-ng, Windows event logs and Jabber.

Since we are using this in production, my Jabber client window looks something like this:

psi 

As you can see there are 3 machines online right now – and since these are Linux machines they also provide some status information like load averages and free memory. The Linux version was written by ahzf in Perl – and obviously his library can handle presence and status information much better than the one I used for the Windows version 🙂 – so there is no presence and status information for the Windows machines right now.

The Windows version is written in C# and relies on the Jabber.NET library. It comes with a small setup and runs as a Windows service.

jabbereventlog_windows

In the setup you have to enter the username and password of a user that can access the local Windows event log. After the successful setup you need to edit the config file:

editconfig

It's XML and quite easy to understand (I think) – you define the Jabber server, the user, the password, the users that should receive the messages and the event log you want to monitor.

After starting the service you get the startup message via the Jabber server, and from then on everything that is written into the Windows event log is sent to the accounts you specified. Easy, eh?
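The Windows side of this is surprisingly little code. A stripped-down sketch of the idea (the actual Jabber send is stubbed out here – the released tool does that part through the Jabber library):

```csharp
// Minimal sketch: subscribe to the local Windows event log and forward every
// new entry. The Jabber send itself is left as a placeholder method.
using System;
using System.Diagnostics;

class EventLogForwarder
{
    static void Main()
    {
        var log = new EventLog("Application");          // the event log to monitor
        log.EntryWritten += (sender, e) =>
            SendJabberMessage(string.Format("[{0}] {1}: {2}",
                e.Entry.EntryType, e.Entry.Source, e.Entry.Message));
        log.EnableRaisingEvents = true;

        Console.WriteLine("forwarding event log entries, press enter to stop...");
        Console.ReadLine();
    }

    static void SendJabberMessage(string body)
    {
        // placeholder: hand the text to the Jabber library and
        // send it to the configured recipients
        Console.WriteLine(body);
    }
}
```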

P.S.: the source code release will follow once we have packaged everything.

Source: http://code.google.com/p/jabber-net/

7 Comments

Power Install Party

Hmm… setting up the new gear for the office infrastructure can be somewhat time consuming…

powerinstallparty2

Having relatively huge VMware Server host machines, we're power-installing all the virtual machines that are needed for our in-house infrastructure…

No Comments

SONES Office 80% complete

Developer office #1 is already 99% finished… and the rest of the rooms are almost complete as well 🙂

So here are a few updated panoramas:

IMG_3878_stitch

(Achim at work)

IMG_3880_stitch

IMG_3910_stitch

No Comments

Job offer: Software Developer .NET / C#

We are hiring!

Sones GmbH is a young IT company located in Erfurt. We do research in the field of novel database and storage technologies and develop new and innovative products and solutions on that basis.

For our Erfurt location we are looking for, starting immediately, a

Software Developer .NET / C# (m/f)

You want to develop innovative software in a young team, software that breaks completely new ground in the database segment? As a software developer at Sones GmbH you have the opportunity to do just that!

In a highly motivated team of developers you will work on the core of our database system. You will develop features and improve the quality of the code base with regard to stability, performance and scalability, using state-of-the-art development tools.

If you see our high standards for technical knowledge, initiative and communication as a challenge – then you are very welcome with us!

Your tasks:

  • Project planning and project management in coordination with other development areas
  • Analysis, design and implementation of new product features
  • Improving the quality of existing code with regard to stability, performance and scalability
  • Software testing and documentation
  • Evaluation of new technologies and prototyping

Requirements:

  • A degree in computer science or comparable training with convincing references (projects, previous employment)
  • Several years of experience in object-oriented software development
  • A plus:
    • Programming skills in .NET and C#
    • Experience with test-driven development
    • Good English skills
    • Experience with database architectures and network programming

Your soft skills:

  • Strong communication skills and a willingness to share knowledge and information dynamically
  • Reliability and an independent, creative way of thinking and working
  • A goal- and solution-oriented approach

We offer:

  • A highly motivated and qualified team
  • Exceptionally interesting and innovative fields of work
  • Plenty of room for initiative and creativity
  • Continuous opportunities for further training and development
  • The challenging environment of a high-tech start-up

Interested? Then we look forward to receiving your detailed application, including your salary expectations, at jobs@sones.de

For completeness, here is the job offer once more as a PDF:

2 Comments

The new SONES office 🙂

Since the beginning of this week we have officially moved into the new rooms, and accordingly things have been buzzing here the whole time. Ikea has already finished the assembly and right now the electricians are pulling in the necessary network and power cables. It's like a beehive around here 🙂

Technically, at least my own workplace is already fully set up – all in all, working here is much more pleasant than in the old office and everything looks a lot nicer.

IMG_3831_stitch

Yes, it still looks wild – but it's a work in progress 🙂

Did I mention that the office I am sitting in is the only one with a door to the roof terrace? Oh, that is going to be great in summer!

IMG_3838_stitch

The sky has been nastily grey lately – I guess I am in for an interesting drive home today.

No Comments

elastic windows

Amazon has done its thing and you can now order Windows-based machines on EC2. That's great news for us since we're definitely planning to make our software available on EC2 as well.

“Amazon EC2 running Microsoft Windows Server® 2003 is a fast and dependable environment for deploying applications using the Microsoft Web Platform, including ASP.NET, ASP.NET AJAX, Silverlight™, and Internet Information Server (IIS). Amazon EC2 enables you to run any compatible Windows-based solution on AWS’ high-performance, reliable, cost-effective, cloud computing platform. Common Windows use cases include website and web-service hosting, high-performance computing (HPC) and data processing, media transcoding, distributed testing, ASP.NET application hosting, and any other application requiring Windows software. Amazon EC2 also now supports the SQL Server® Express and SQL Server Standard databases, and makes those offerings available to customers on an hourly basis.”

Source: http://aws.amazon.com/windows/

No Comments

the new VMWare Server 2.0…

…is such a great product.

It was easier to install than the 1.0 version, and since the VMware Server Console is gone and the WebAccess has been revamped, it has a great new user interface.

vmware2

Source: http://vmware.com/products/server/

5 Comments

Looking for a CMS…

In our small company we are currently also looking for a usable content management system, so this article came just at the right time: an overview of some of the big CMSs. At the moment the person responsible for the website favours Typo3 – I made it available via a VM – but it didn't really convince me, at least – oh well, I am not the one who has to work with it.

"At least I am not the only one! And my problem isn't new either: back in early 2004 the great Dave Shea was already looking for a suitable CMS that would satisfy his (really not particularly exotic) requirements. That is pretty much how I feel right now, just almost 5 years later. And it seems not much has changed :-)"

typo3

Source: http://praegnanz.de/weblog/subjektiver-cms-einkaufsfuehrer

No Comments

When did RAID become independent?

Once upon a time I was told about this cool technology that lets you take several hard drives and glue them "together" into a single big volume. This technology was called RAID – Redundant Array of Inexpensive Disks – and that it was. It brought us greater levels of reliability and performance – and it was inexpensive compared with other technologies. Since hard drive prices have been falling for years and storage space keeps growing along with that, it's getting even cheaper than anything else you could use to store data securely. Some of us even back up to an independent RAID system.

In the beginning there were several hard drive interface technologies in use – mainly Parallel ATA and SCSI. It was widely accepted that SCSI drives were specified for 24/7 server usage and were almost always faster than their consumer PATA relatives. It was accepted that if you wanted to build a reliable industry-grade RAID you would want to use SCSI drives – the SCSI bus system even had advantages like up to 7 drives per bus (compared to just 2 drives with PATA) and hot-swap capabilities.

Over the last years it turned out that SATA is the new interface technology that replaces the old SCSI and PATA. There are several server-grade SATA drives available now – these drives are getting cheaper, faster and bigger by the minute. So, you might think, there's no real purpose for anything "more server than server-SATA". Again, if you want to build inexpensive and redundant storage arrays there is nothing cheaper than standard or even server SATA drives. They are fast, reliable and huge.

So some years ago the industry presented: the SAS interface. It's called "Serial Attached SCSI" and is the "new cool thing in hard disk storage". There are some niche features that may or may not justify the existence of SAS. A fact is that SAS hard drives of the same size and speed are more expensive.

"SATA is marketed as a general-purpose successor to Parallel ATA and is now common in the consumer market, while the more expensive SAS is marketed for critical server applications." (Wikipedia)

It's getting worse: the industry started to offer fast hard drives (15,000 rpm) only for the more expensive SAS interface. The few 15k rpm SATA drives are not slower in any way than their SAS versions – but they are not widely available and all of a sudden cost the same price as the SAS version.

But back to the definition of RAID:

So over the years the technology made a giant leap forward, and all of a sudden you find yourself using very expensive hard drives while gluing them together into giant volumes (it's now terabytes… petabytes…), while consumer hard drives are available for about a third (at least) of the price of the server version of the same drive. It seems that the widely accepted definition of inexpensive has been replaced by independence. I do know that there are use cases where you want to use the fastest spinning drive available regardless of the price – but I also think that there could be affordable fast-spinning drives if we weren't expected to pay the marketing fee that SAS brings. It's plain marketing to make new 15k rpm drives only available for SAS and not for SATA. Marketing and nothing more.

As it turns out, many industry (marketing) brains (hey, even Wikipedia) are switching to a new definition of RAID. It's now a Redundant Array of Independent Disks – which I think is a definition that could not be worse. It's not independence we gain with the new definition.

Source 1: http://en.wikipedia.org/wiki/RAID#cite_note-1
Source 2: http://en.wikipedia.org/wiki/Serial_Attached_SCSI

No Comments

Photosynth is open for the public

Believe it or not – it's been 2 years since I first wrote about the Photosynth technology. Today Microsoft made it available to the public. It's not a standalone tool yet – like I wanted – but it's built into this website: you upload your pictures, they are processed, and then you can browse them on the website… well, it's a start for a really great technology.

“We’re pleased to announce the first full release of Photosynth, available now at photosynth.com. Photosynth takes a collection of regular photographs and reconstructs the scene or object in a 3-D environment. For those of you who have seen the videos or tried our tech preview, you could experience synths that we made in the lab and get a feel for what Photosynth is and how it works. But now, for the first time ever you can create synths from your own pictures and share them with your friends. Explore great synths from others or create a few of your own.”

halo3photosynth

It’s not going to work on anything different than Windows. So stick to the movies if you’re on anything else. But as far as I know it’ll run o

Source 1: http://photosynth.net/Default.aspx
Source 2: http://www.schrankmonster.de/PermaLink,guid,fdc3d1fb-4966-418b-83ea-1e0c12aae833.aspx

No Comments

Vote for us! Stimmt ab für SONES

I already reported that our little startup "SONES" was nominated by INTERNET World Business for the election of the best business idea of 2008. We have now made it into the top 20, and everyone who wants to has the chance to support us in this vote:

You can vote for us via this link 🙂:

vodeinternet

Vote-link

Source: initial article

1 Comment

Student Technology Conference 2008 Agenda

The agenda of this year's STC is online. You can take a look here.

stc08agenda

Source: http://www.studentconference.de/Agenda.aspx

No Comments

German Microsoft Student Technology Conference 08 announced!

"The date is set: our STC 2008 will take place on May 15, 2008!

We cordially invite you to Berlin and are looking forward to a great day with you! A great location, exciting talks and exchange with Microsoft experts and contacts await you, so that you can give your career a boost in the spirit of networking.

You will also have the chance to follow which Imagine Cup team will represent Germany in the Software Design category at the international finals in Paris. The Imagine Cup is the world's largest technology competition for pupils and students – you can find all information about the competition at www.imaginecup.info."

stc08

This year the STC will take place at the Kalkscheune in Berlin.

Source 1: http://www.studentconference.de
Source 2: STC 2007
Source 3: http://www.kalkscheune.de/

No Comments

TechFest 2008: Turning Ideas Into Reality

I told you I would write about the things I have been working on for the past months. Last week TechFest 2008 took place at Microsoft in Redmond/WA. Almost the whole team I am working with was there – I haven't spoken to anybody personally yet, but it seems to have gone well:

Rick Rashid, Microsoft Research senior vice president shows a prototype device with a Web-service interface developed by Microsoft researchers that runs an energy-management application that saves energy by actively monitoring the weather and energy variations. This is one of 40 exciting emerging technologies on display at Microsoft TechFest 2008 which brings researchers, customers, academics, dignitaries and employees. Redmond, Wash., March 4, 2008. Robert Sorbo/Microsoft/Handout

“Microsoft Research’s TechFest is an annual event that brings researchers from Microsoft’s labs around the world to Redmond to share their latest work with the product teams. Attendees will experience some of the freshest, most innovative technologies emerging from Microsoft’s research efforts. The event provides a forum in which product teams and researchers can discuss the incredible work occurring in the labs, thereby encouraging effective technology transfer into Microsoft products.”

research 
fast forward to minute 24…one of the interesting bits starts right there!

Source 1: http://wm.microsoft.com/ms/research/events/TechFest2008/TF08Keynote.wmv
Source 2: http://research.microsoft.com/techfest/
Source 3: http://www.schrankmonster.de/PermaLink,guid,cf5f2c46-60d2-4bb6-b58b-c50f5f3ce4d8.aspx

No Comments

sones flyer are ready to fly

For the last year and something I have been affiliated with a startup called "sones". The website has already launched, the products can be bought, and now the marketing machinery starts to roll.

Today three flyers came from the print shop… take a look to learn more:

contentplatform ecommerce socialnetwork

There will be more articles about sones in the future… depending on the time I will have 🙂

Source: http://www.sones.de

1 Comment

On my way to Cambridge…

Since two of my colleagues wrote about their work at Microsoft Research I wanted to write at least something about it…just like Andreas said:

“One reason I recently don’t blog too much is the fact that I am a bit restricted in what I can tell. Being involved in some exciting projects, the confidentiality of these projects does not allow much publicity.”

So really the only thing I can write about is that I am honored to work with these great people and to be part of the process of creating great software.

So – for now I am on my way back to Cambridge – the next article will be written from there…

a320

Of course I will write about all the things when I am allowed to do so…

Source 1: Martin Calsyn
Source 2: Andreas Heil
Source 3: http://research.microsoft.com/ero/

No Comments

24h trip to England…

While writing this I am still in Cambridge, England and packing my stuff to take the next plane back home…

Sadly I will only see the night sky on this trip because I started at 06:00 AM GMT+1 this morning in Nürnberg and I will take off again from London Stansted at 19:45 GMT…

IMG_0161

3 Comments

food for a day…

Damn, there's quite a lack of decent food in the UK… that's all I got:

IMG_8569

4 Comments

on the road…erm…in the air again…

In the next few hours I will be on the plane to the UK for my one-week stay in Cambridge… more on what I am doing there and everything else once I'm there…

737800

No Comments

Team Foundation Server 2008 final Feature List

vs2008

The Team Foundation Server 2008 Feature list is finalized and available… read it here:

“Administration, Operations & Setup

  • Share Point 2007 support
  • Enable use of Sharepoint on any server and any port
  • Support for MOSS 2007
  • Enable support for Reporting Services on any server and any port (new) (RTM)
  • Support for SQL Named Instances – This will allow customers to share a SQL server between multiple TFS instances, or with other applications. This has been a commonly requested feature by enterprises.
  • “Longhorn” server support – TFS will support the next version of the server (and corresponding new version of IIS) that is currently under development.
  • Sync Large Groups – This is a set of work to improve the performance and robustness of TFS’s handling large groups of users (~30,000 or more) granted permission to a TFS instance. Today this can result in a support call to recover from it.
  • Non-default ports – We’ve gotten a bunch of feedback from enterprise customers about TFS’s limited support for alternate web sites and ports running afoul of data center policies. We are going to be improving TFS’s configurability in this respect in Orcas.
  • Simplify installation – In Orcas, we will be doing a variety of things to attempt to make installing TFS easier and quicker than it is now. Improvements include eliminating the separate data-tier installation, simplifying the requirements around required domain accounts by supporting the built in machine accounts (like Network Service) where we can, etc.
  • Official testing and support for more configurations – This includes clustering, mirroring, log shipping, Virtual machine deployment, and more.
  • Support for client certificates
  • Upgrade from TFS 2005
  • Support for SQL 2008 (aka Katmai) (new) (RTM)
  • TFSDeleteProject now permanently deletes (destroys) version control content (new) (RTM)
  • New role for many operations activities (new) (RTM) – You don’t have to be server administrator to run many of the admin utilities any longer.
  • Enhancements to tfsadminutil (new) (RTM) – New capability to configure accounts, connections, etc on both TFS and the TFS proxy.

Build (more detail)

  • Support multi-threaded builds with the new MSBuild.
  • Continuous Integration – There are many components to this, including build queuing and queue management, drop management (so that users can set policies for when builds should be automatically deleted), and build triggers that allow configuration of exactly how and when CI builds should be triggered, for example – every checkin, rolling build (completion of one build starts the next), etc.
  • Improved ability to specify what source, versions of source, and other build properties.
  • Improved extensibility of the build targets – such as ability to easily execute targets before and after each solution/project is built.
  • Improved ability to manage multiple build machines.
  • Stop and delete builds from within VS.
  • .NET Object model for programming against the build server.
  • Simplified ability to specify what tests get run as part of a build.
  • The ability to store build definitions anywhere in the version control hierarchy.
  • Scheduled builds – You can schedule builds to happen at specified times.
  • Improved build agent communication – We replaced .NET binary remoting with WCF web services, simplifying some configuration and security aspects.
  • Ability to run GUI tests as part of a build – Automated builds used to run tests in such a way as to prevent access to a GUI desktop.
  • New checkin policy for broken CI builds – Preventing checkin while the CI build is broken.
  • Support for HTTPS communication to the TFS server (new)
  • Continuous Integration build checkin policy (new)
  • Support for incremental gets and builds (new)

Data Warehouse

  • Add support for checkin policy overrides to the warehouse – an oversight from V1.

Migration

  • Migration toolkit – A toolkit for building conversion and mirroring solutions between TFS and other systems. In addition, we will release one or more new tools to integrate with popular alternative systems.

Version Control

  • Annotate – This is based on the TFS Annotate Power Tool but includes numerous improvements.
  • Folder Diff – Also based on the TFS Tree Diff Power Tool with numerous improvements.
  • Destroy – The ability to permanently delete version control files/folders from TFS. It can also be used to destroy the file contents while preserving the change set history.
  • Get Latest On Checkout – There have been many requests for this feature (which was a change in behavior from SourceSafe). There is now an option that allows you to specify that you want TFS to download the latest version of files when you check them out.
  • Workspace improvements – Workspaces will now support mapping a folder or file under a cloaked folder and wildcard mappings so that you can map all files in a folder without mapping sub folders. Based on experience with large projects, this will simplify workspace definitions for many people.
  • Performance improvements – A variety of Version Control performance enhancements that will improve virtually all aspects of version control performance. The gains for smaller servers/projects (< 10,000 files) will be modest. The gains for larger projects (particularly where the file count approaches 100,000’s) will be substantial.
  • Scale improvements – Fixed out of memory problems on the server when operating on more than a few hundred thousand files at a time.
  • Offline improvements – We’ve significantly improved the experience going offline and integrated the tfpt online capability into the IDE for going back online.
  • Extranet support for the TFS Proxy – allowing you to access a local TFS proxy with a different set of credentials than the TFS server.
  • Command line help – You can now type “tf command /help” and get a console dump of the usage of that command. This is much more convenient than always being launched into the richer GUI hypertext help when you just want to remember what the options for a command are. You can still launch the GUI help by running “tf msdn”. You can get a console dump of available commands by just typing “tf help”.
  • Source Control Explorer refresh improvements – This includes less redrawing and reloading but even more important it enables updates based on changes made in other instances of TeamExplorer or the command line. That’s right, if you check out a file from the command line, any instances of TeamExplorer you have running on the same machine will automatically refresh.
  • Async loading of the Source Control Explorer (new)
  • The SCE local path can now be selected and copied (new)
  • Merge improvements (new) – Improved the logic that detects merge conflicts to generate fewer false positives and handle more scenarios.

Work Item Tracking

  • Performance & Scale improvements – A variety of improvements that will make both the work item server and client faster and able to handle larger servers.
  • Query builder usability improvements – Drop down filtering based on current project, better MRU lists, column drag & drop, shift-click mouse based multi-column sorting, etc.
  • Attachments improvements – Save button, drag & drop for adding an attachment, multi-select for attaching files.
  • Tooltips on field names contain the field name used for querying
  • Server side support for deleting work items & work item types – We didn’t have time to do client UI support for it but we plan to release a Power Tool that will take advantage of the new server side feature.
  • Support for security on the iteration hierarchy (new)

Web Access

  • Adding Web Access UI to TFS – As you’ve seen many places, we acquired devBiz and their TeamPlain Web Access product. We are releasing it as a Power Tool in the next few months and plan to release it as an official product in the Orcas timeframe. We have not figured out how the release date will line up with the Orcas date but it will be in the same general timeframe.

Bug fixes

  • In addition to all of the feature work, we’ve spent months testing the product and fixing any bugs we’ve found. We expect Orcas will have even better stability and robustness than TFS 2005.

Compatibility (no change since last time)

As Orcas is an adoption focused release, we have put a lot of emphasis on compatibility with VS2005. We are striving for near 100% compatibility. The Orcas client will be able to work with a VS2005 server and a VS2005 client will be able to work with an Orcas server. There are only a few compatibility issues.

  • Client side VS add-ins will need to be recompiled (or have policy changed) because the TFS OM assembly versions will change and add-ins will need to bind to the new assemblies. The APIs themselves are generally not changing, so we don’t expect much in the way of code changes – just recompilation.
  • Build is the only area where we plan to have some compatibility disconnects. In general, most build operations – listing build definitions, starting and stopping builds, examining build reports, etc. will work both with 2005 client -> Orcas server and Orcas client -> 2005 server. However, here are a few caveats:
    1. An Orcas TFS server will only work with an Orcas build server – so you’ll need to upgrade your build server when you upgrade your TFS server.
    2. For an VS2005 client to start a build on an Orcas server, the build definition needs to be stored at $//TeamBuildTypes/. In Orcas, you have more flexibility as to where to put them.
    3. Changes made to properties in the .proj file that are in the database in Orcas will not be updated in the database and will no longer be in sync.
    4. VS2005 will be able to start a build, but it can’t queue a build, see the list of builds in the queue, see the list of build agents, etc.
    5. An Orcas client will not be able to create a new build definition on a TFS2005 server.
    6. When starting a build, an Orcas client will not be able to change any parameters in the dialog for a TFS2005 Server.”
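
One item from the Build section above that is worth spelling out is the “rolling build” trigger. Here is a minimal conceptual sketch of the idea in plain Python (purely illustrative, not TFS code and not any real TFS API): instead of queuing one build per checkin, the next build starts as soon as the previous one finishes and picks up every checkin that arrived in the meantime.

import queue
import threading
import time

def run_build(changesets):
    # Placeholder for the real build; one build covers everything up to the newest changeset.
    print("building changesets %d..%d" % (changesets[0], changesets[-1]))
    time.sleep(2)  # pretend the build takes a while

def rolling_build_loop(checkins):
    while True:
        pending = [checkins.get()]        # block until at least one checkin arrives
        while not checkins.empty():       # drain whatever piled up while the last build ran
            pending.append(checkins.get())
        run_build(pending)                # one build picks up all pending checkins

if __name__ == "__main__":
    checkins = queue.Queue()
    threading.Thread(target=rolling_build_loop, args=(checkins,), daemon=True).start()
    for changeset in range(1, 11):        # simulate ten checkins arriving faster than builds finish
        checkins.put(changeset)
        time.sleep(0.3)
    time.sleep(5)                         # give the last rolling build time to finish

The nice property of this kind of trigger is that a burst of checkins produces at most one additional build instead of one build per checkin, so the build machines don’t drown while every change still gets built eventually.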

Source: http://blogs.msdn.com/bharry/archive/2007/08/08/final-tfs-2008-feature-list.aspx

No Comments