New Focus on Monitoring Soil

There is nothing new about monitoring soil in arid conditions. South Africa and Israel have been doing it for decades. However, climate change has increased its urgency as the world comes to terms with pressure on the food chain. Denizon decided to explore trends at both the macro, first-world level and the micro, third-world one.

In America, the Coordinated National Soil Moisture Network is going ahead with plans to bring federal and state monitoring networks and numerical modelling techniques together in a single database, with an eye on integrating soil-moisture data. This is a component of the National Drought Resilience Partnership that slots into Barack Obama’s Climate Action Plan.

This far-reaching program touches every corner of American life to address the twin scourges of drought and inundation, and the agency director has called it “probably … one of the most innovative inter-agency tools on the planet”. The pilot project, involving remote moisture sensing and satellite observation, targets Oklahoma, North Texas and surrounding areas.

Africa has similar needs but lacks America’s financial muscle. Princeton University ecohydrologist Kelly Caylor is bridging the gap in Kenya and Zambia by using cell phone technology to transmit ecodata collected by low-cost “pulsepods”.

He deploys the pods, each about the size of a smoke alarm, to measure plants and their environment. Measurements include soil moisture, to estimate how much water the plants are using, and sunlight, to approximate the rate of photosynthesis. Each pod holds seven to eight sensors, can operate on or above the ground, and transmits the data via SMS.
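To give a simplified picture of how such SMS-borne readings might be handled once received, the sketch below parses a hypothetical comma-separated payload into a structured record. The field names and message layout are assumptions for illustration, not the pulsepods’ actual protocol.

    # A minimal sketch, assuming a hypothetical SMS payload such as
    # "POD07,2016-03-14T06:00,sm=0.23,par=412,temp=19.5", where sm is volumetric
    # soil moisture, par is sunlight (photosynthetically active radiation) and
    # temp is air temperature. The real pulsepod message format may differ.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class PodReading:
        pod_id: str
        timestamp: datetime
        soil_moisture: float   # volumetric water content, as a fraction
        sunlight: float        # proxy for the rate of photosynthesis
        temperature: float     # degrees Celsius

    def parse_sms(payload: str) -> PodReading:
        """Split one SMS payload into a structured reading."""
        pod_id, stamp, *fields = payload.split(",")
        values = dict(f.split("=") for f in fields)
        return PodReading(
            pod_id=pod_id,
            timestamp=datetime.fromisoformat(stamp),
            soil_moisture=float(values["sm"]),
            sunlight=float(values["par"]),
            temperature=float(values["temp"]),
        )

    if __name__ == "__main__":
        print(parse_sms("POD07,2016-03-14T06:00,sm=0.23,par=412,temp=19.5"))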

While the system is working well at the academic level, there is more to do before the information is useful to subsistence farmers living from hand to mouth. The raw data stream requires interpretation, and the analysis must arrive through trusted channels, most likely government and tribal chiefs. Caylor cites the example of a sick child: the temperature reading is of no use until a trusted source interprets it.

He has a vision of climate-smart agriculture in which traditional practice adapts to global warming. He involves local farmers in his research by enrolling them when he places pods and asking them to send weekly weather reports by SMS, which he correlates with the sensor data. As trust builds, he hopes to help them choose more climate-friendly crops and learn how to reallocate labour as seasons change.

Check our similar posts

Uncover hidden opportunities with energy data analytics

What springs to mind when you hear the words energy data analytics? My first reaction is that it is not my thing. Energy data analytics, however, is of great importance to any organisation or business that wants to run more efficiently, reduce costs and increase productivity, and energy efficiency is one of the best ways to accomplish those goals.

Energy efficiency does not have to mean investing in expensive equipment or internal reorganisation. Enormous energy-saving opportunities are hidden in the energy data that already exists. Given that energy data can nowadays be recorded from almost any device, a great deal of data is captured regularly and is therefore readily available.

Organisations can use this data to turn their buildings’ operations from a cost centre into a revenue centre by reducing energy-related spending, which has a significant impact on the profitability of many businesses. This becomes possible when data is analysed and interpreted to predict future events with greater accuracy. Energy data analytics is therefore about putting very detailed data to work, and is consequently a crucial part of any data-driven energy management plan.

Applying data and IT can drive significant cost savings in company-owned buildings and vehicle fleets. Virtual energy audits can be performed by combining energy meter data with basic information about a building, such as its location, to identify potential energy-saving opportunities. Investment in energy dashboards further lets companies keep an ongoing view of where energy is being consumed in their buildings and find ways to reduce usage. In short, energy data analytics unlocks savings opportunities and helps companies understand their everyday practices and operating requirements far more comprehensively.
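To make the idea of a virtual energy audit more concrete, here is a minimal sketch that joins metered consumption with basic building data (floor area) and flags buildings whose energy intensity sits well above the portfolio average. The building names, figures and 20% threshold are illustrative assumptions, not any vendor’s method.

    # A minimal sketch of a "virtual energy audit": normalise consumption by
    # floor area and flag outliers against the portfolio average.

    buildings = {
        "Dublin HQ":    {"floor_area_m2": 4200, "annual_kwh": 610_000},
        "Cork depot":   {"floor_area_m2": 1500, "annual_kwh": 310_000},
        "Leeds office": {"floor_area_m2": 2800, "annual_kwh": 350_000},
    }

    def energy_intensity(b):
        """Annual consumption normalised by floor area (kWh per m2)."""
        return b["annual_kwh"] / b["floor_area_m2"]

    portfolio_average = sum(map(energy_intensity, buildings.values())) / len(buildings)

    # Flag buildings consuming well above the portfolio average as audit candidates.
    for name, data in buildings.items():
        intensity = energy_intensity(data)
        if intensity > 1.2 * portfolio_average:   # 20% tolerance, an arbitrary choice
            print(f"{name}: {intensity:.0f} kWh/m2 - investigate for savings")
        else:
            print(f"{name}: {intensity:.0f} kWh/m2 - within expected range")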

Energy data analytics enables an organisation to determine discrepancies between baseline and actual energy data, and to benchmark current usage against previous performance. It also helps businesses determine whether their Building Management System (BMS) is operating efficiently and hitting targeted energy usage goals, and to investigate areas for improvement or energy-efficient upgrades. When energy data is closely monitored, companies tend to operate more efficiently and with better control over the relevant BMS data.
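As a rough illustration of the baseline-versus-actual check mentioned above, the sketch below flags months where metered consumption departs from a baseline. The monthly figures are invented; in practice a baseline would normally be modelled from historical data adjusted for weather and occupancy.

    # A minimal sketch of a baseline-versus-actual comparison with made-up figures.

    baseline_kwh = {"Jan": 52_000, "Feb": 48_000, "Mar": 45_000}
    actual_kwh   = {"Jan": 51_200, "Feb": 53_600, "Mar": 44_100}

    for month in baseline_kwh:
        deviation = actual_kwh[month] - baseline_kwh[month]
        pct = 100 * deviation / baseline_kwh[month]
        status = "over baseline" if deviation > 0 else "at or under baseline"
        print(f"{month}: {deviation:+,} kWh ({pct:+.1f}%) - {status}")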

Symbion Pharmacy Services’ Definition of Responsibility

A “symbion” is an organism in a symbiotic (i.e. mutually beneficial) relationship with another. In the case of Australia’s giant Symbion Pharmacy Services, this means supplying and delivering over-the-counter Chemmart medicines to more than 3,000 hospital and retail pharmacies while remaining mindful of its carbon footprint.

In 1999, the company, with the tagline “life matters” and a desire to be seen as “a good corporate citizen”, decided it was time to measure exactly what it was pumping out from 12 facilities and over 200 vehicles. This was a voluntary decision, as even now there is no carbon emissions law in Australia (although no doubt being a “first mover” will put the company in a competitive position when one inevitably arrives).

Symbion decided to install emission detection devices and connect them to a central monitoring system, with the intention of managing what they measured. There were two stages to this process: first, Symbion determined its reporting requirements based on one of its larger warehouses; it then established a carbon footprint for each of its wholly owned and managed facilities. This put it in a position to:

  • Analyse total emissions down to a level of detail where it understood the contribution of each source
  • Use big data management tools to identify carbon hotspots for priority remedial action (see the sketch after this list)
  • Inform the affected workforce, explain the monitoring system and keep them in the loop
  • Separately manage energy abatement programs such as lighting and delivery routes
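
A minimal sketch of the source-level breakdown and hotspot flagging described above might look like the following. The facilities, emission sources and tonnage figures are invented for illustration and are not Symbion’s data.

    # A minimal sketch: aggregate emissions by source and flag "hotspots".

    emissions = [  # (facility, source, tonnes CO2e per year) - invented figures
        ("Melbourne warehouse", "electricity", 820),
        ("Melbourne warehouse", "delivery fleet", 1_340),
        ("Sydney warehouse",    "electricity", 610),
        ("Sydney warehouse",    "refrigeration", 390),
        ("Brisbane depot",      "delivery fleet", 970),
    ]

    # Aggregate by source across the whole operation.
    by_source = {}
    for _facility, source, tonnes in emissions:
        by_source[source] = by_source.get(source, 0) + tonnes

    total = sum(by_source.values())
    print(f"Total: {total:,.0f} t CO2e")

    # A "hotspot" here is any source contributing more than a quarter of the total.
    for source, tonnes in sorted(by_source.items(), key=lambda kv: kv[1], reverse=True):
        share = tonnes / total
        flag = "HOTSPOT" if share > 0.25 else ""
        print(f"  {source:<15} {tonnes:>6,.0f} t ({share:.0%}) {flag}")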

The program also had productivity spin-offs in that it focused management attention on the processes behind the emissions that were ripe for material and system improvements. It also provided marketing leverage. Symbion’s customers are in the wellness business, ahead of the curve when it comes to how emissions contribute to chronic illness, and aware of the cost of this in terms of human capital.

EcoVaro could help you manage your throughputs by analysing your data on our cloud-based system. This includes trending your metrics, comparing them with your industry’s seasonal averages, and providing you with a business-like view of how well you are doing.

Our service reduces your reliance on (and the cost of) third party audits, and simplifies the reporting process to your controlling authority. It simply makes more sense to contract your software out this way, and only pay for it when you need it.

How DevOps oils the Value Chain

DevOps, a clipped compound of “development” and “operations”, is a way of working whereby software developers are in a team with project beneficiaries. A client-centred approach extends the project plan to include the life cycle of the product or service for which the software is developed.

We can then no longer speak of a software project for, say, Joe’s Accounting App. The software has no intrinsic value of its own; it follows that the software engineers are building an accounting-app product. This is a small but crucially important distinction, because they are no longer in a silo with different business interests.

To take the analogy further, the developers are no longer contractors possibly trying to stretch out the process. They are members of Joe’s accounting company, and they are just as keen to get to market fast as Joe is to start earning income. DevOps uses this synergy to achieve the overarching business goal.

A Brief Introduction to DevOps

You can skip this section if you have already read this article. If not, then you need to know that DevOps is a culture, not a working method. The three “members” are the software developers, the beneficiaries, and a quality-control mechanism. The developers break their task into smaller chunks instead of releasing the code to quality control as a single batch. As a result, the review process happens continuously, along these simplified lines:

Batch 1:  Code   QC     Test   -      -      -
Batch 2:  -      Code   QC     Test   -      -
Batch 3:  -      -      Code   QC     Test   -
Batch 4:  -      -      -      Code   QC     Test

Key (originally a colour key): Code = Developers, QC = Quality Control, Test = Beneficiary

This is a marked improvement over the previously cumbersome method below.

Write the Code → Test the Code → Use the Code → Evaluate, Schedule for Next Review → (back to Write the Code)

Working quickly and releasing smaller amounts of code means the DevOps team learns quickly from mistakes, and should reach product release ahead of any competitor using the older, more linear method. The shared way of working frees up huge resources in terms of user experience and in-line QC practices. Instead of sitting in a silo working on its own, development finds it has a richer brief and more support from being “on the same side of the organisation”.
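For readers who like to see the overlap spelled out, the sketch below (an illustration added here, not part of the original material) prints which batch of code sits in which stage at each step, mirroring the staggered table in the previous section.

    # A minimal sketch of the staggered pipeline: each small batch moves through
    # Code -> QC -> Test while the next batch is already being written.

    STAGES = ["Code", "QC", "Test"]

    def staggered_schedule(n_batches):
        """For each time step, report which batch is in which stage."""
        steps = n_batches + len(STAGES) - 1
        timetable = []
        for step in range(steps):
            row = {}
            for batch in range(n_batches):
                stage = step - batch
                if 0 <= stage < len(STAGES):
                    row[f"batch {batch + 1}"] = STAGES[stage]
            timetable.append(row)
        return timetable

    if __name__ == "__main__":
        for step, row in enumerate(staggered_schedule(4), start=1):
            print(f"step {step}: {row}")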

The Key Role that Application Programming Interfaces Play

Application Programming Interfaces, or APIs for short, are building blocks for software applications. Using these proprietary software bridges speeds development up. A good example is the PayPal integration we find on so many websites today. APIs are not just for commercial sites; they can reduce costs and improve efficiency considerably.

The following diagram, courtesy of TIBCO, illustrates how second-party applications integrate with the PayPal architecture via an API façade.


[Diagram: second-party applications integrating with the PayPal architecture via an API façade, courtesy of TIBCO]
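To make the façade idea concrete in code rather than a diagram, here is a minimal sketch in which a thin PaymentFacade hides a hypothetical two-step gateway workflow behind a single call. The class and method names are invented stand-ins, not PayPal’s actual API.

    # A minimal sketch of the API-facade pattern. The gateway client and its
    # methods are hypothetical, not a real payment provider's SDK.

    class HypotheticalGatewayClient:
        """Stand-in for a third-party payment gateway SDK."""
        def authorise(self, card_token: str, amount_cents: int) -> str:
            return f"auth-{card_token}-{amount_cents}"

        def capture(self, authorisation_id: str) -> bool:
            return authorisation_id.startswith("auth-")

    class PaymentFacade:
        """One simple entry point that hides the gateway's two-step workflow."""
        def __init__(self, client: HypotheticalGatewayClient):
            self._client = client

        def charge(self, card_token: str, amount_cents: int) -> bool:
            auth_id = self._client.authorise(card_token, amount_cents)
            return self._client.capture(auth_id)

    if __name__ == "__main__":
        facade = PaymentFacade(HypotheticalGatewayClient())
        print("Payment succeeded:", facade.charge("tok_123", 4_999))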

The DevOps Revolution Continues …

We close with some important insights from an interview with Jim Stoneham, who was general manager of the Yahoo Communities business unit at the time Flickr became part of it. “Flickr was a codebase,” Jim recalls, “that evolved to operate at high scale over 7 years – and continuing to scale while adding and refining features was no small challenge. During this transition, it was a huge advantage that there was such an integrated dev and ops team.”

The “maturity model”, as engineers currently refer to DevOps status, enables developers to learn faster and deploy upgrades ahead of their competitors. This means the client reaches and exceeds break-even sooner. DevOps lubricates the value chain so companies add value to a product faster. One reason it worked so well at Flickr was the immense trust between Dev and Ops, and that is a lesson we should learn.

“We transformed from a team of employees to a team of owners. When you move at that speed, and are looking at the numbers and the results daily, your investment level radically changes. This just can’t happen in teams that release quarterly, and it’s difficult even with monthly cycles.” (Jim Stoneham)
