Founder of ToolsJournal, a technology journal on software tools and services. Sudheer has overall accountability for the website product development and is responsible for Sales and Marketing. With a flair for writing, Sudheer himself writes for ToolsJournal across all journal categories.
Website URL: http://www.toolsjournal.com/
Joomla, the open-source web content management system (CMS), has announced availability of its next major version, 3.0.0. While 3.0 comes with a wide range of features, including responsive design support and an optimized admin console that works well on mobile devices, the open-source project has urged users to assess whether they need to upgrade from version 2.5.x.
Quantcast, a 2006 big data startup that invested in efficiency innovations for Hadoop, has now released the Quantcast File System (QFS) to open source. Evolved from the Kosmos Distributed File System (KFS, also known as CloudStore), QFS offers a higher-performance alternative to the Hadoop Distributed File System (HDFS) for batch data processing, significantly improving data I/O speeds and halving the disk space required to reliably store massive data sets. Fully integrable with Apache Hadoop, QFS has been live at Quantcast for four years, reliably handling petabyte-scale production workloads.
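The halving of disk space comes from QFS's use of Reed-Solomon erasure coding instead of HDFS's default 3-way block replication. A back-of-the-envelope sketch of that trade-off, assuming HDFS's default replication factor of 3 and a common 6+3 data/parity stripe layout for Reed-Solomon (the specific figures here are illustrative, not taken from the article):

```python
def replication_footprint(data_tb, copies=3):
    """Raw disk needed when every block is stored `copies` times (HDFS default)."""
    return data_tb * copies

def erasure_coding_footprint(data_tb, data_stripes=6, parity_stripes=3):
    """Raw disk needed when each group of data stripes carries extra parity stripes."""
    return data_tb * (data_stripes + parity_stripes) / data_stripes

data_tb = 100  # a hypothetical 100 TB data set
hdfs_raw = replication_footprint(data_tb)       # 300 TB of raw disk
qfs_raw = erasure_coding_footprint(data_tb)     # 150 TB of raw disk
print(f"3x replication: {hdfs_raw:.0f} TB, 6+3 erasure coding: {qfs_raw:.0f} TB")
```

With these assumed parameters the erasure-coded layout needs half the raw disk of 3-way replication while still tolerating the loss of any three stripes in a group, which matches the "halving the disk space" claim above.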
OpenStack has announced its new release, 2012.2, codenamed "Folsom". Thierry Carrez (ttx), Release Manager at OpenStack, said that the six-month Folsom journey for OpenStack core projects is nearing its end, a ride that involved more than 330 contributors implementing 185 features and fixing more than 1,400 bugs in core projects alone!
QASymphony today announced it is releasing a beta of qTest, a cloud-based software testing management application. Designed for small QA teams looking to formalize their quality control procedures and workflow, qTest organizes the testing process with the goal of creating “reliable market-ready applications.”
Big data usually includes data sets with sizes beyond the ability of commonly used software tools to capture, manage, and process the data within a tolerable elapsed time. The size of what can be called big data is a constantly moving target and, as we speak, ranges from terabytes to petabytes. A new breed of tools is required to handle such huge data, and such tools are not an extension of, or a supplement to, traditional BI software. Four key things that such solutions should address are data storage, data processing, data analytics, and solution integration.
Predicting the volume of data that will be available for analysis within an organisation is both easy and difficult. But the rate of growth in volumes is ever increasing, and in the next few years the repository could become too big even for big data software to traverse and analyse. What if you put a context around the data and analysed only the contextual data? That sounds like a more reliable solution, and it is exactly what OpTier has launched with its new big data offering, OpTier BDA (Big Data Analytics), which classifies data into defined contexts and analyses the relevant contextual data.
MarkLogic has released a new version of its NoSQL database with a key focus on improving system performance and making it easier for organizations to generate secure big data results. The three key areas targeted to improve performance of the NoSQL database are faster application development, powerful insightful analytics, and in-database MapReduce.
What if you were able to avoid more than half of the bugs you would otherwise find in testing your critical software? Nice, right? LDRA, the provider of automated software verification and test tools, has come up with a programming rule checker that brings together a collection of rules from a broad spectrum of programming standards.