
An Overview Of Blockchain Technology (or, the Internet Of Value)

 

Overview of blockchain technology

 

Over the last few years, the buzz around blockchains has grown immensely. In 2015, banks across the globe invested around $75 million in the technology. By the end of 2019, that figure is expected to jump to $400 million – clearly underlining the fact that Satoshi Nakamoto’s (whoever he, she, or they may be) innovative distributed ledger platform is still at an early stage of growth – and will gain even more recognition, understanding and popularity in the near future. The mounting interest in blockchains is also reflected in the large investments made by venture capitalists in companies in this sector. Tech giants like Microsoft, IBM and PwC have already started to work with the technology, and in today’s discussion, we will take a look at some interesting tidbits about blockchains:

  1. The need for blockchains

    While internet services are essential to a truly ‘shared, secure economy’, their presence is not a sufficient condition. By nature, web-based services are created to manage, store, transfer and monitor ‘information’ – they are generally not engineered to create ‘value’ (i.e., the internet can make business processes more efficient, but cannot change the processes per se). Blockchain, often referred to as the ‘internet of value’ (IoV, anyone?), plugs that gap effectively. Also, unlike traditional internet tools and portals, blockchains do not have any centralized servers – and there are no fees payable for their services (since there are no intermediaries or so-called middlemen). Blockchains enable direct, peer-to-peer exchange of value through a robust digital channel. Implementation of this distributed ledger technology also ensures greater engagement (a large cross-section of people cannot afford the services of intermediaries), and offers greater data privacy and confidentiality.

  2. Understanding a blockchain

    The name might seem rather nerdy, but blockchains actually represent a fairly simple digital technology. To put it simply, a blockchain is a one-of-its-kind digital ledger or recordkeeping system that tracks and records all transactions in a network. A new ‘block’ is added to the ‘chain’ every time a new transaction takes place on a particular asset (apart from financial transactions, blockchains are also used to record cryptocurrency activity, retail transactions, medical records, supply chain data, and a host of other types of transactions). Every relevant member of the network can view a transaction (say, between Person X and Person Y), although the two parties actually involved in the transaction might opt to keep their identities hidden (or use pseudonyms). In other words, a blockchain system is just like a public ledger, where the transaction records are distributed to all interested parties. The information chain (with time-stamped blocks) is secured with public-key cryptography – and no single user can modify, delete or tamper with any ‘block’ of information on his/her own.

Note: The initial block in a transaction chain on a blockchain is called the ‘genesis block’ (numbered 0 or 1). The individual blocks are connected to each other with the help of cryptographic digests called ‘hashes’.
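To make the ‘chain of blocks’ idea concrete, here is a minimal, illustrative Python sketch (not how bitcoin is actually implemented) of blocks that each store the hash of their predecessor, all the way back to the genesis block:

```python
import hashlib
import json
import time


def block_hash(block):
    """SHA-256 digest of the block's contents (everything except its own hash)."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()


def new_block(index, data, prev_hash):
    """Create a time-stamped block that points back at the previous block's hash."""
    block = {"index": index, "timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block


chain = [new_block(0, "genesis block", "0" * 64)]                # block 0: the genesis block
chain.append(new_block(1, "X pays Y 2 BTC", chain[-1]["hash"]))  # each new block links to the last
print(chain[1]["prev_hash"] == chain[0]["hash"])                 # True: the blocks are chained
```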

  3. Who invented blockchains, anyway?

    Ah, that’s something no one quite knows for sure. All that is known is that the first white paper introducing blockchains was published on 31 October 2008, and the ‘genesis block’ was mined in January 2009. At first, it was widely believed that a man called Satoshi Nakamoto had invented the technology (he was also credited as the inventor of the bitcoin cryptocurrency). However, the ‘actual Satoshi’ categorically stated in 2014 that he had ‘nothing to do’ with bitcoins or blockchains. Since then, the names of Michael Clear (a cryptography graduate from Trinity College, Dublin) and Craig Steven Wright (an Australian coder and entrepreneur) have surfaced as possible identities behind Satoshi Nakamoto. There is also a suspicion that the three men who filed the ‘encryption patent application’ (Charles Bry, Vladimir Oksman and Neal King) might have collectively worked under the pseudonym of Nakamoto. The three have, however, denied this. The brains behind blockchains remain unidentified, and it makes for delightful tech gossip!

  4. The operation of blockchains

    We have already explained the nature and main purposes of a blockchain. Let us now get an idea of how a distributed ledger actually works. The process starts with a transaction request from Entity 1 to Entity 2. A ‘block’ is created to represent that transaction on the network, and that ‘block’ gets automatically distributed to all the authorized, interested nodes. These other network members have to approve the transaction (i.e., validate it). Once that is done, the ‘block’ gets added to the ‘chain’, and the transaction takes place between the two parties. The transaction details are shared on the ledgers of all the members of the system (as indelible records). That, in turn, makes the entire system transparent and ensures that everyone is aware of all the relevant transactions. All forms of digital currency transactions can be recorded on a blockchain – and the system also ensures that a bitcoin cannot be spent twice.

Note: In the financial services sector, blockchains have already proved to be instrumental for removing the (often significant) time-gap between transaction and settlement. Disintermediation is the biggest reason for this.

  5. Can blockchains be hacked?

    Anything that uses digital resources can, in theory, be hacked. However, hacking a blockchain is, for all practical purposes, almost impossible – and the technology, hence, makes transactions and big data more secure than ever before. Since blockchain is a ‘distributed technology’ and is completely decentralized, there are no centrally located servers that can be targeted by hackers. The information stored in the system is shared across all the nodes of the architecture, and is present on the computers of all involved data miners. In order to successfully hack a blockchain ledger, all the records in a chain of transactions have to be separately tweaked (every block is connected to the previous block, creating a chain-like structure). Experts opine that the cost of hacking a blockchain (in terms of invested time and resources) is generally higher than the potential benefits from doing it. Blockchains might not be hack-proof, but they are the closest thing to it.
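A small illustrative sketch of why tampering is so hard to hide: editing any block changes its hash, which breaks the ‘prev_hash’ link of every block that follows (the function and field names below are made up for the example):

```python
import hashlib
import json


def block_hash(block):
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()


def chain_is_valid(chain):
    """Every block's stored hash must match its contents and link to its predecessor."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):                    # block was edited
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:  # link to previous block broken
            return False
    return True


b0 = {"index": 0, "data": "genesis", "prev_hash": "0" * 64}
b0["hash"] = block_hash(b0)
b1 = {"index": 1, "data": "X pays Y 2 BTC", "prev_hash": b0["hash"]}
b1["hash"] = block_hash(b1)

ledger = [b0, b1]
print(chain_is_valid(ledger))      # True
b0["data"] = "X pays Z 200 BTC"    # an attacker quietly rewrites history...
print(chain_is_valid(ledger))      # False: b0's hash no longer matches, so the tampering is exposed
```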

  6. The bitcoin revolution

    The blockchain technology was introduced as a platform for recording transactions of the Bitcoin digital cryptocurrency (released in 2009). Bitcoin transactions are either anonymous or (more commonly) pseudonymous, and transfers are made/received at pre-specified ‘Bitcoin addresses’ (no mutual trust is required between the transacting parties). Due to its nature, it is extremely difficult to trace the movement of bitcoins (unlike, say, credit card payments or wire transfers). The distributed ledger is periodically updated by the network, after checking the available balances at different ‘Bitcoin addresses’. New, unconfirmed bitcoin transactions are checked roughly every ten minutes by ‘bitcoin miners’, who allocate the necessary computing and processing power in exchange for a certain amount of the cryptocurrency. In 2016, this ‘reward’ was cut to 12.5 bitcoins for every completed block (the reward started at 50 bitcoins per block and is halved roughly every four years). With a market capitalization of ~$67.5 billion, bitcoin is by far the most popular cryptocurrency in circulation at present. Ethereum (market cap ~$30 billion) occupies the second spot.

Note: The price of one bitcoin is more than $4000 (subject to occasional dips, like the one this July). In comparison, the rate of a unit of Ethereum varies in the $310 – $330 range.
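As a rough illustration of the reward schedule mentioned above (50 BTC at launch, halved every 210,000 blocks, i.e., roughly every four years), a few lines of Python reproduce the 12.5 BTC figure:

```python
def block_reward(height, initial=50.0, interval=210_000):
    """Block subsidy in BTC at a given block height: halved every `interval` blocks."""
    return initial / (2 ** (height // interval))


print(block_reward(0))        # 50.0  -> the 2009 starting reward
print(block_reward(210_000))  # 25.0  -> after the first halving (2012)
print(block_reward(420_000))  # 12.5  -> after the second halving (July 2016)
```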

  7. The concept of smart contracts

    In a blockchain’s rule-oriented transaction ecosystem, ‘smart contracts’ take over the role of middlemen and ensure that everything is optimally automated. Advanced coding goes into the creation of these contracts, along with preset deal workflows, sensor services, distributed apps and custom APIs. A ‘smart contract’ gets triggered whenever certain conditions are fulfilled (for example, blood sugar levels in a medical report, or wattage in an electric meter, going above a predetermined level) – and the requisite actions are initiated. From intellectual property management, banking and financial transactions, and 3D printing, to manufacturing and delivery logistics – everything can be efficiently managed by blockchain ‘smart contracts’. In the distributed ledger, all business rules are pre-programmed into these contracts, and all members of the network are notified of the same.

Note: Solidity is a Turing-complete programming language created for the Ethereum platform; it is used for coding smart contracts.
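Real smart contracts on Ethereum are written in Solidity, but the underlying ‘trigger’ idea can be sketched in a few lines of ordinary Python; the names below are purely illustrative and do not represent an actual contract API:

```python
def make_contract(threshold, action):
    """Return a rule that fires `action` automatically once a reading crosses `threshold`."""
    def evaluate(reading):
        if reading > threshold:
            return action(reading)   # condition met: the pre-agreed action runs, no middleman
        return None                  # condition not met: nothing happens
    return evaluate


# Illustrative rule: raise an alert if a blood-sugar reading exceeds an agreed level.
alert_doctor = make_contract(180.0, lambda r: f"ALERT: blood sugar reading {r} mg/dL")
print(alert_doctor(120.0))   # None
print(alert_doctor(210.0))   # ALERT fired automatically
```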

  8. The role of keys

    For recording bitcoin transactions on a shared ledger, users need their unique ‘private key codes’, which serve as their passwords to the blockchain system. Every private key is associated with a specific ‘bitcoin address’ (the key, hence, serves as the user’s credentials) – and smart contracts can be coded only after a network member has entered his/her key. Users’ private keys are stored in their respective ‘wallets’. On blockchain systems in general, keys can be either ‘private’ or ‘public’. In broad terms, a public key can be described as the tool from which ‘public addresses’ on blockchains are generated (via cryptographic hashing).
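The ‘public address from a public key’ idea can be illustrated with a simplified hashing sketch. Bitcoin actually applies SHA-256, then RIPEMD-160, and finally a Base58Check encoding step; the sketch below stops at the hashing stage, and the hex key is a placeholder rather than a real key:

```python
import hashlib

# Placeholder 33-byte compressed public key (illustrative hex, not a real key).
public_key = bytes.fromhex(
    "02b4632d08485ff1df2db55b9dafd23347d1c47a457072a1e87be26896549a8737"
)

sha = hashlib.sha256(public_key).digest()
try:
    digest = hashlib.new("ripemd160", sha).hexdigest()   # the hash bitcoin applies next
except ValueError:
    digest = hashlib.sha256(sha).hexdigest()             # fallback if RIPEMD-160 is unavailable
print("address material derived from the public key:", digest)
```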

  9. Impact of blockchains on employment

    Blockchains are being created and implemented by…well, anyone involved in digital transactions (including IoT), for the verification of the ‘transaction blocks’. Since this verification process becomes automated in the system, the technology can potentially replace a large percentage of the mid-level accountants – who perform the same verifications manually – across the world. What’s more, the digital representation of contracts (i.e., the ‘smart contracts’) can do away with the need to draw up the same contracts repeatedly, and hence, with the need for many lawyers. However, these perceived employment losses would be more than offset by the increased opportunities offered by the new technology – with a strong, well-trained workforce required to manage blockchains and use them in an optimized manner. In a nutshell, digital distributed ledgers would increase the demand for a ‘qualified workforce’, while reducing the need for repetitive manual work. Firms are chasing greater efficiency with blockchains – and the technology is being adopted in a wide range of industries.

Note: The main application areas of blockchains (apart from financial services) are covered in the next post of this archive.

    10. The downside of blockchains

Ross William Ulbricht, the founder of the world’s first cryptocurrency-based illegal ‘darknet market’, ‘Silk Road’ (for buying/selling drugs), was sentenced to life imprisonment in 2015. However, the success (albeit for a limited period) of Silk Road (v3.0 was pulled down earlier this year) showcased a way in which the blockchain technology can be misused. As previously mentioned, the system allows for anonymous transactions (owner data can remain hidden) – and only the transaction details are entered into the shared ledger. As a result, malpractices and illegal trading with bitcoins can be initiated by shady third-party users. As the blockchain market matures over the next few years, this issue is expected to be addressed. It is a powerful technology, and developers have to ensure that it is not being used for underhand practices.

Blockchain is still a fair way off from becoming mainstream, with the market expected to mature by 2025 (it is currently in an ‘early adoption’ stage). The growth in the interim is all set to be remarkable, with leading tech giants as well as a host of startups (Slock.it, Enigma, SETL) becoming actively involved in developing/leveraging the technology. To sum up, the open-source blockchain distributed ledger replaces centralized gateways/servers and delivers cutting-edge recordkeeping services for all types of digital transactions. Easily one of the growing technologies to watch out for!

 

 

Blockchain Beyond Financial Services: 13 Applications & Use Cases

 

Uses of blockchains in different industries

 

The financial services sector has been the earliest, and one of the biggest, adopters of the distributed ledger technology (DLT) – more popularly known as blockchain technology. Introduced in 2008 by a little-known person/team/entity named Satoshi Nakamoto, blockchains have grown rapidly in recent years – keeping pace with the burgeoning popularity of cryptocurrencies. According to a March 2017 survey, nearly 8 out of every 10 banking institutions have already started creating their unique blockchain architecture – and it has been predicted that around 15% of the major banking institutions worldwide will become active users of the technology before the end of the year. Given the fact that blockchains can potentially bring down the annual infrastructural expenses of banks worldwide by a whopping $20 billion by 2022, this rapid spurt is not surprising.

It is interesting to note, though, that blockchains are no longer regarded as tools whose utility is limited to the banking and financial services sector. In 2017, 23% of all finance professionals are likely to invest >$5 million in the technology. That is considerably lower than the interest levels in the manufacturing industry (where 42% of executives plan to invest similar amounts) as well as in the media, tech and telecom industry (where 27% have the same investment plans). Blockchain technology is slowly but surely moving beyond the finance industry, and here we take a look at some other interesting applications of this breakthrough DLT:

  1. Logistics management and supply chain auditing

    Blockchains can play a very important role in enhancing the security and efficiency of the storage/transfer of products (perishable goods, in particular). Right from packing and storage, to quality testing and distribution – every activity can be recorded on the distributed ledger, and all concerned parties on the network will be notified about the same. The data that has to be entered by users will vary across the different stages of the supply chain. With blockchains, auditing and establishing the authenticity of each step in the logistics system becomes easier and way smarter than ever before.

  2. Data handling

    The volume of data that we regularly collect (and have to scan and process correctly) is expanding rapidly. A blockchain-based ledger setup can ease the overall process of data management for both companies and governmental bodies – by recording the details of different entries, making data handling tasks simpler and more transparent, and ensuring uniformly high levels of data security. Since the data records in the blockchain are time-stamped, the total cost of data management can be cut down (and the chances of errors become minimal). The analytics information required for different types of applications can also be supported on a blockchain. The technology helps with compliance-related issues as well.

  3. Blockchain for IoT

    The first signs of integrating the blockchain technology into the domain of the Internet of Things (IoT) came along in January 2015, when IBM released a proof-of-concept for ADEPT (in collaboration with Samsung). The concept involved using the underlying design architecture of bitcoin to create a decentralized IoT setup. Last August, Chronicled applied the Ethereum blockchain to create an IoT Open Registry. Real-time analytics can be connected with IoT through the edge nodes, and there are several ways in which the usability and interoperability of consumer products can be improved (the DLT can store the identities of the different goods, which would ideally carry advanced NFC chipsets). Predictive maintenance is yet another field in which IoT and blockchain technology can be effectively combined: the latter can be used to detect probable glitches/damages in IoT devices and generate warning notifications. The probability of successful hack attacks would also be significantly reduced.

  4. Online voting

    In early 2016, it was announced that a decentralized distributed ledger system would be used in the official e-Residency platform (for companies listed on the Tallinn Stock Exchange) in Estonia – the announcement was made by the republic and Nasdaq. Blockchains have the potential to play a key role in auditing ballot boxes in e-voting, thereby restoring much of the credibility of voting systems. A single coin, representing ‘one vote’, would be assigned to each end-user (whose credentials would be held in a ‘wallet’). The coin can be ‘spent’ (i.e., the vote can be cast) only once. Apart from keeping fraudulent practices in check, a blockchain-supported voting infrastructure would also be less exposed to online security threats.
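A toy sketch of the ‘one coin = one vote’ rule described above: the ledger remembers which voting tokens have already been spent, so a second attempt with the same token is rejected (all names here are illustrative):

```python
spent_tokens = set()    # in a real blockchain this record is replicated on every node


def cast_vote(token_id, candidate, ledger):
    """Record a vote only if this voting token has never been spent before."""
    if token_id in spent_tokens:
        return False                                   # duplicate: every node rejects it
    spent_tokens.add(token_id)
    ledger.append({"token": token_id, "vote": candidate})
    return True


ledger = []
print(cast_vote("token-42", "Candidate A", ledger))    # True: vote recorded on the ledger
print(cast_vote("token-42", "Candidate B", ledger))    # False: the same 'coin' cannot be spent twice
```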

  5. Electricity trading

    Blockchain technology does away with the need for middlemen – and that is one of the biggest reasons behind its burgeoning popularity in peer-to-peer (P2P) electricity trading applications. In a ‘transactive energy’ setup, each user will be able to trade (buy/sell) power with neighbors securely, promptly, and without any hassles. The implementation of blockchains lends a high degree of reliability to the overall ‘power trading’ system. Distributed ledger technology is also witnessing healthy adoption for the exchange of renewable energy credits (RECs). There are several other use cases of blockchains in the power and energy sector, like validating energy trades, managing smart grids, analyzing and benchmarking big data, and trading ‘green certificates’. Metering local energy generation points also becomes easier.

  6. Real estate management

    With the help of a high-performance software-as-a-service platform, the tasks of recording, tracking and transferring information on real estate deeds can be facilitated (that is exactly what Ubitquity does). Real-time notifications about all transactions (as and when they happen) are sent to all the relevant members of the network – ensuring complete transparency and communicability. Professional mortgage companies can also benefit from developing and implementing blockchains. What’s more, the technology can be integrated with ‘smart home’ systems, to monitor the critical parameters of assets.

  7. Ushering in Sharing Economy 2.0

    The distributed ledger technology is, by definition, decentralized, which makes it a great tool for taking the ‘sharing economy’ to the next level. Practically every form of digital information can be supported and stored on blockchains, and activities (irrespective of their scale) can be easily monetized. That, in turn, opens up the possibilities of various types of direct, peer-to-peer transactions – like electricity trading (discussed above), hiring cabs (without the presence of middlemen like Lyft or Uber), data-sharing and providing advisory services. In an ecosystem managed by blockchains, people will be able to seamlessly connect with each other and perform direct transactions.

Note: Blockchains are also being used to deliver endorsements, and for the ranking and verification of online reputation of users. ‘The World Table’ and ‘ThanksCoin’ have emerged as major players in this niche.

  8. User identification, authentication and security

    The need for maintaining robust digital security standards is not limited to transactions in the financial sector alone. The open-source distributed ledger technology is expected to make the task of identifying and authenticating users a lot more efficient. All that individual users have to do is create unique identities on the blockchain network, to manage/control the nature and accessibility of their personal data. The technology will assign a ‘digital ID’ to each user – which would help in tracking all future transactions of their asset(s). Credit report malpractices, cases of online identity theft and cyber fraud will all go down with the progressive implementation of the blockchain technology. User identification with the help of biometric tools (like fingerprint scanners) is also being facilitated by underlying blockchains. A case in point is UniquID.

  9. Managing intellectual property rights

    A recent report revealed that illegal downloads bring down the total proceeds from online music sales by ~85%. Other forms of digital art face similar risks. To protect intellectual property rights (IPR) and ensure that optimal returns are obtained from art collections, blockchains might well be the best tools. On the network, owners of music and other artworks can upload their work, provide a watermark to establish ownership, and manage/track the transfer of these digital art items. In essence, the availability of a virtual decentralized ledger allows artists to control how, where and when their creations are used/transferred/deployed, at all times. The movement of digital art in a blockchain framework is similar to bitcoin transactions, and the chances of IPR violations are minimal.

Note: Grammy-winning musician Imogen Heap has created a music-streaming platform powered by blockchain technology. It is called Mycelia.

    10. Establishment of decentralized social networks

This is one of the newest domains in which blockchain technology has started to make its presence felt. Unlike Facebook or Twitter or any other currently popular social media sites, a decentralized social network (DSN) would not involve any centralized, controlling company/entity – and user privacy will be much higher. The same software can be used for connecting multiple servers in a DSN. Synereo, DATT and Diaspora are some examples of the decentralized social networks that are coming up. Many experts feel that social networking will become more and more decentralized over the next decade or so – and if that is indeed the case, the role of blockchains over here will become increasingly prominent.

    11. Blockchain in healthcare

Infosys has already identified the medical sector as one in which distributed ledgers will have a strong role to play in the foreseeable future. E-medical records can be made more accurate and secure than ever before – and there will be no intermediaries involved in their maintenance. Blockchain’s superior handling of these records is likely to help in the creation of smarter health information exchange (HIE) models. In addition, the new technology has the power to enhance the interoperability of different health records, as well as ease the processes of testing proofs-of-concept and conducting medical experiments. In a blockchain-supported healthcare system, the patient (with his/her information) is always at the core.

   12. Gold/silver bullion trading

A user-friendly investor platform has already been introduced by The Real Asset Company – for the purchase/sale of gold and silver bullion safely and effectively. In the commodities market, blockchains add a definitive edge, by creating secure online accounts for buying/holding precious metals. New cryptocurrencies backed by gold/silver can also serve two key purposes: i) creating an additional layer of transparency and security on top of the vaulting setup and investments, and ii) facilitating the return of these precious metals to the global monetary system (Goldbloc is a cryptocurrency that does this).

Note: Blockchains have also proved to be useful for tackling illegal activities in the diamond trading business. On a decentralized, immutable ledger, all diamond identification records and transaction details are registered. The ‘digital passport’ of diamonds (as assigned by Everledger) ensures their authenticity and helps to create trackable footprints.

   13. Role in smart agriculture

In several previous posts, we have discussed different aspects of smart, IoT-supported agriculture. This sector can also see extensive application of the blockchain technology over the next few years. While IoT can boost productivity and yield quality in a big way, distributed ledgers can bring about improvements at various points of the agricultural ecosystem – from establishing fair-trade practices and in-depth auditing standards, to ensuring data integrity and round-the-clock compliance. The overall value chains in the primary sector should become more efficient, visible and fair, thanks to the disruption caused by blockchains.

Earlier this year, International Airport Review mentioned blockchain as one of the ‘six technologies to revolutionize the airport and aviation industry in 2017’. The technology can play important roles in the job market (including support for recruitment), in creating cutting-edge network infrastructure and distributed applications, in the worldwide gaming industry (including legal gambling), and in making market forecasts more accurate. Bitnation, the first ever blockchain-technology-powered jurisdiction (it is, in essence, a ‘virtual nation’ in itself), was created back in 2014 – and it has citizens and stakeholders all over the globe. Blockchains are, of course, still very important in the financial services market – but their adoption in other fields is growing fast.

 

Precision Agriculture: Top 15 Challenges and Issues

Smart farming challenges

 

In the last five years or so, the total volume of investments in the agricultural sector has grown by a massive ~80%. According to experts, precision agriculture (the technique of optimizing existing inputs and fertilizers, tillage tools, fields and crops, for the purpose of improved control and measurement of farm yields) has the potential to play a key role in meeting the incremental food demands of the growing worldwide population. A recent report estimated the value of the global precision farming market at the end of this decade at around $4.6 billion – with the CAGR between 2015 and 2020 being just a touch under 12%. In the United States alone, the market for smart agriculture software is likely to grow by more than 14% between now and 2022. However, the actual growth and proliferation of precision farming has not been as robust as was expected earlier. The sector faces several key challenges, and we turn our attention to them in this post:

  1. Interoperability of different standards

    With more and more OEMs coming up with new and innovative agricultural IoT tools and platforms, interoperability is rapidly becoming a point of concern. The various available tools and technologies often do not follow the same technology standards/platforms – as a result of which there is a lack of uniformity in the final analysis done by end users. In many instances, the creation of additional gateway(s) becomes essential, for the translation and transfer of data across standards. As things stand now, precision agriculture (while evolving rapidly) is still, to a large extent, fragmented. The challenge lies in transforming the smart standalone devices and gateways to holistic, farmer-friendly platforms.

  2. The learning curve

    Precision farming involves the implementation of cutting-edge technology for bolstering crop growth. For the average farmer, setting up the necessary IoT architecture and sensor network for his/her field(s) can be a big ask. It has to be kept in mind that the room for error in a tech-upgraded ‘smart farm’ is minimal – and faulty management (a wrongly pressed valve here, forgetting to switch off the irrigation tank there, etc.) can be disastrous. Getting farmers thoroughly acquainted with the concept of smart farming, and the tools/devices involved in it, is of the utmost importance – before they can actually proceed with the implementation. Lack of knowledge can be dangerous.

  3. Connectivity in rural areas

    In many remote rural locations across the world (particularly in developing countries, although several locations in the US suffer from this as well), strong, reliable internet connectivity is not available. That, in turn, thwarts attempts to apply smart agriculture techniques at such places. Unless network performance and bandwidth speeds are significantly improved, implementation of digital farming will remain problematic. Since many agro-sensors/gateways depend on cloud services for data transmission/storage, cloud-based computing also needs to become stronger. What’s more, in farmlands that have tall, dense trees and/or hilly terrain, reception of GPS signals becomes a big issue.

  4. Making sense of big data in agriculture

    The modern, connected agricultural farm has, literally, millions of data points. It is, however, next to impossible to monitor and manage every single data point and reading on a daily/weekly basis, over entire growing seasons (nor is it necessary). The problem is particularly acute in large, multi-crop lands and when there are multiple growing seasons. The onus is on the farmers to work out which data points and layers they need to track on a regular basis, and which data ‘noise’ they can afford to ignore. Digital agriculture is increasingly becoming big data-driven – but the technology is helpful only when users can ‘make sense’ of the available information.

  5. Unawareness of the varying farm production functions

    In-depth economic analysis needs to complement internet tools, to ensure higher yields on farms. Users need to be able to define the correct production function (output as a function of key inputs, like nutrients, fertilizers, irrigation, etc.). Typically, the production function is not the same for all crops, differs across the various zones of a farm, and also changes over the crop/plant-growth cycle. Unless the farmer is aware of this varying production function, there will always remain the chance of applying inputs in incorrect amounts (too much nitrogen fertilizer, for example) – resulting in crop damage. Precision agriculture is all about optimizing output levels by making the best use of the available, limited inputs – and for that, understanding the production function is vital.
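As a hedged illustration of what a ‘production function’ looks like in practice, the sketch below assumes a quadratic yield response to nitrogen, with made-up coefficients that differ between two zones of the same farm (they are not agronomic recommendations):

```python
def yield_response(n_kg_ha, a, b, c):
    """Quadratic response: yield (t/ha) = a + b*N + c*N^2, with c < 0 so over-application hurts."""
    return a + b * n_kg_ha + c * n_kg_ha ** 2


def optimal_nitrogen(b, c):
    """N that maximises the quadratic response: set d(yield)/dN = b + 2cN to zero."""
    return -b / (2 * c)


zones = {"sandy zone": (2.0, 0.060, -0.00020), "clay zone": (3.0, 0.045, -0.00012)}
for zone, (a, b, c) in zones.items():
    n_opt = optimal_nitrogen(b, c)
    print(f"{zone}: optimal N = {n_opt:.0f} kg/ha, expected yield = {yield_response(n_opt, a, b, c):.2f} t/ha")
```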

  6. Size of individual management zones

    Traditionally, farmers have considered their entire fields as single farming units. That approach is, however, far from effective for the application and management of IoT in agriculture. Users have to divide their lands into several smaller ‘management zones’ – and there is quite a lot of confusion regarding the ‘correct’ size of these zones. The zones have to be demarcated with respect to soil sampling requirements (different zones have varying soil qualities) and fertilizer requirements. The number of zones on a field, and their respective sizes, should depend on the overall size of the growing area. There is not much reference work for farmers to go by when trying to divide their lands into these zones. As an alternative, many farmers continue to follow uniform fertilizer application and/or irrigation methods for the entire farm – leading to sub-optimal results.
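One simplified way to think about zone delineation is to bucket soil-sample grid cells by a measured property (organic matter, in this illustrative sketch) and treat each bucket as a management zone; the thresholds and cell names are placeholders, not agronomic recommendations:

```python
samples = [   # (grid_cell_id, measured organic matter %)
    ("A1", 1.2), ("A2", 1.4), ("B1", 2.6), ("B2", 2.9), ("C1", 4.1), ("C2", 3.8),
]


def zone_of(om_percent):
    """Assign a grid cell to a management zone based on its organic-matter reading."""
    if om_percent < 2.0:
        return "low-OM zone"
    if om_percent < 3.5:
        return "medium-OM zone"
    return "high-OM zone"


zones = {}
for cell, om in samples:
    zones.setdefault(zone_of(om), []).append(cell)

print(zones)   # each zone can now get its own fertilizer dose instead of a uniform rate
```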

  7. Barriers to entry for new firms

    Although precision farming has been a subject of considerable interest for several years now, the concept is still relatively ‘new’. As such, the big hardware/software manufacturers that entered this market at an early stage still enjoy a definite ‘first-mover advantage’. The low competitiveness of the market can deter new firms from entering this domain – with the existing big firms retaining a stranglehold. Farmers can also face problems while trying to migrate data streams from an older platform to a newer one, and there are risks of data loss. The resources and platforms provided by a big player in the agro-IoT sector might not be compatible with those provided by a smaller OEM – and that might prevent the latter from attracting enough clients.

  8. Lack of scalability and configuration problems

    Agricultural farms can be of different sizes. A single owner can have a large crop-growing land, along with several smaller plots. In India, nearly 33% of the total area under agriculture is accounted for by only 5% of the total number of farms – clearly highlighting the uneven nature of farm sizes there. A farmer needs to be provided with IoT tools (access points, gateways, etc.) that are completely scalable. In other words, the same technology should be applicable, and the same benefits should be available, on a large commercial farm as well as on a small piece of personal garden/crop land. The need for manually configuring the setup and the devices is yet another probable point of concern. For agriculture to become truly autonomous, the technology should be self-configurable. The recent surges in artificial intelligence and M2M learning open up the possibility of that.

  9. Energy depletion risks

    A lot has already been written about the environmental advantages of switching over to smart agriculture (precision farming is ‘greener’). However, the need for powerful data centers and gateways/hubs for the operation of the smart sensors and other gadgets can lead to heavy energy consumption – and more resources are required to replenish that energy. What’s more, the creation of new agricultural IoT tools also has an effect on the energy sector. Not surprisingly, companies have started to focus on farming technology platforms which do not cause too much energy depletion…but there is still a long way to go in this regard.

  10. Challenge for indoor farming

    Most precision agriculture methods and resources are optimized for conventional outdoor farming. With the value of the global vertical farming industry projected to go beyond $4 billion by 2021, more attention has to be given to technology support for indoor farming. The absence of daily climatic fluctuations and regular seasons has to be taken into account while coming up with smart indoor farming methods. The nutritional value of the outputs must not get adversely affected in any way either. Farmers need to be able to rely on the technology to create the optimal growing environment (light, temperature, water availability) for indoor plants.

  11. Technical failures and resultant damages

    The growing dependence of agriculture (or anything else, for that matter!) on technology comes with a potentially serious downside. If there is a mechanical breakdown in the hardware, or a farming IoT unit/sensor malfunctions – serious crop damages can be the result. For example, in case the smart irrigation sensors are down, plants are likely to be underwatered or overwatered. Food safety can be compromised, if the technological resources in the storage area(s) are not functioning. Even a few minutes of downtime due to a power failure can have serious consequences – particularly when backup power is not available.

  12. Mounting e-wastes

    Farms powered by smart technology have (to varying extents) done away with the problems of runoff, contamination, and other channels of ecological damage. Carbon dioxide emissions have been brought down significantly (~2 gigatonnes over a five-year span) as well. A new risk has cropped up though – in the form of electronic waste (e-waste). In 2013, the total volume of such waste was in excess of 52 million metric tons – and the piles of discarded IoT tools, computers and outdated electronic devices are compounding this problem further. In a nutshell, regular hardware upgrades are making the older units obsolete – and in many areas, dumping them is adding to landfills. For things to be sustainable, proper arrangements for the disposal of e-waste have to be made. Soon.

  13. Loss of manual employment

    On average, 4 out of every 10 members of the global workforce are employed in the primary sector. The figures are particularly high in Oceania, Africa and Asia. As IoT in agriculture becomes more and more mainstream and things become automated, a large percentage of this agricultural labour force will lose their jobs. The other sectors need to have the capacity to absorb this workforce (now rendered jobless) – and in many developing/underdeveloped countries, the economy is not strong enough for that to happen. There is no doubting the benefits that precision agriculture brings to the table – but the large-scale displacement of manual workers can lead to dissatisfaction among people.

  14. The security factor

    The presence of malware and the risk of data theft exist in practically all types of ‘connected systems’, and smart agriculture is not an exception. As the count of middleware technologies, endpoints and IoT devices in active use in agriculture increases, the number of entry points for malicious third-party programs goes up as well. Since third-party attacks on a complex IoT system are often decentralized, detecting and removing them emerges as a big challenge. The situation becomes more complicated due to the propensity of many farm owners to opt for slightly cheaper devices and resources, which do not come with the essential safety assurances. The multiple software and API layers can cause problems as well. There is an urgent need for tighter security and provisioning policies for agricultural IoT – to make it more acceptable to users.

  15. Benefits not immediately apparent

    To get the motivation to invest in a ‘new technology’ like smart farming, users (understandably) want to get an idea of the ROI from this technology. Unfortunately, though, there is almost no way to reliably estimate the benefits of precision farming over the long run – and the benefits do not become apparent from the very outset. For this very reason, many landowners still view the use of advanced technology in agriculture as ‘risky’ and ‘uncertain’, and stay away from adopting it. With greater familiarity with agritech and comprehensive training, such fears should go away.

Smart gadgets that merely provide information about the extent of crop damage are of little use – there is a need for more ‘predictive maintenance’ tools that can anticipate damage and help farmers avoid it. Customization of the sensors and resources to meet the varying nutrient/water/pest control requirements of different plants is a challenge, as is bringing together and comparing data from multiple farms. Farmers need to have a complete knowledge of the correct ‘nutrient algorithms’, so that the platforms/gateways can be configured optimally. There is also room for cutting down the rather frequent ‘yield map errors’, which lead to faulty output estimates.

 

The concept of precision agriculture is based on four pillars – Right place, Right source, Right quantity and Right time. It has already made a difference to agriculture and farm yield performance worldwide…and once the aforementioned challenges are overcome, its benefits will become more evident, more sustainable.

Farming 2.0: How Does IoT Help Agriculture?

 

Role of IoT in smart agriculture

 

The degree of mechanization in agriculture is going up rapidly. At the turn of the century, none of the 525 million farms across the world had sensor technology (or, for that matter, IoT in any other form). Cut to 2025, and we will witness more than 620 million sensors being used (considering the same benchmark of 525 million farms). The growth and proliferation of agricultural internet of things (Agro-IoT) is expected to pick up even more pace from then on – with ~2 billion smart agro-sensors expected to be in active use by 2050. Between 2017 and 2022, the agricultural IoT market is set to expand at a mighty impressive CAGR of around 16%-17%. In what follows, we will put the spotlight on the role of IoT in agriculture and analyze how smart technology is helping the sector:

  1. Boost to precision farming

    Traditionally, the agricultural sector has been fraught with risks. There are plenty of factors, ranging from rainfall forecasts and improper irrigation, to faulty planting/harvesting methods and poor soil quality, that can have adverse effects on overall productivity. Agricultural IoT offers farmers a great way to keep such uncertainties at arm’s length. With the help of advanced agro-sensors, users can get real-time, highly accurate data from their fields – on the basis of which key decisions (‘when to irrigate?’, ‘when to harvest?’, etc.) can be taken. Round-the-clock access to all relevant information minimizes the chance of crop losses, and also helps growers make better, more well-rounded farming plans. With the growth of precision agriculture, the concepts of site-specific crop management (SSCM) and satellite farming (SF) are coming into the picture.

  2. The role of big data in agriculture

    In 2014, an average agricultural farm had fewer than 200,000 data points. By 2050, that figure will jump to 4 billion data points – a testament to how quickly ‘connected farms’ will grow during this period. In the realm of data-driven agriculture, it is becoming increasingly easy to track and monitor important parameters, like soil quality, plant nature and health, pest infestations, fertilizer usage, the state of agricultural machinery, storage facilities, and a host of other factors. The better handling of chemical fertilizers, along with smart irrigation management, offers environmental benefits as well. In essence, IoT in agriculture can very well be termed a ‘necessary innovation’ – the technology has the potential to boost both the quality and quantity of crop yields.

Note: According to an OnFarm report, integration of IoT can bolster yields by nearly 2%, bring down water-wastage by ~7%, and also cause significant energy savings (per acre).

  3. Arrival of agricultural drones

    Unmanned aerial vehicles (UAVs) are playing an increasingly important role in smart farms. People can use these farming drones to track soil and weather conditions (like sensors, they can work in collaboration with satellites and other third-party tools), as well as create detailed 3D maps of the fields. The 3D geomapping technique is particularly useful for quickly detecting existing inefficiencies in the field, and taking corrective measures immediately. Monitoring the crop life cycle and performing a supervisory role (very important in relatively large farms, where manual supervision is difficult) feature among key functions of agricultural drones. The value of the worldwide agro-drone industry is already well over $32 billion, and the figure is expected to climb sharply over the next half a decade or so.

  4. More efficient irrigation

    Lack of proper water management has been a long-standing bane of the primary sector. As highlighted in a previous post, close to 60% of the water released for agriculture gets wasted – due to overwatering, runoff, contamination, and other related issues. What’s more, instances of crops getting damaged as a result of under/over-watering are also fairly common. Once again, such problems can be effectively tackled by farmers by upgrading their fields to the IoT platform. Right from tank filling & management and valve operations, to chalking out optimized irrigation sessions/schedules – everything can be performed via advanced Sensor Observation Service (SOS) tools. The irrigation requirements of crops are estimated carefully, along with the moisture content of the soil (as well as its acid content). That, in turn, helps in efficient utilization of limited water resources (a key factor in drought-prone locations). As per reasonable estimates, integration of smart irrigation tools can save up to 50 billion gallons of water annually.
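The decision logic behind a basic smart irrigation controller can be sketched in a few lines: water only when the soil-moisture reading falls below a crop-specific threshold, and only for as long as it takes to reach the target level (all thresholds and readings below are illustrative):

```python
def irrigation_minutes(moisture_pct, threshold_pct=30.0, target_pct=45.0, pct_per_minute=0.5):
    """Minutes of watering needed to lift soil moisture from the current reading to the target."""
    if moisture_pct >= threshold_pct:
        return 0.0                                   # soil is wet enough: skip this cycle
    return (target_pct - moisture_pct) / pct_per_minute


for reading in (22.0, 28.5, 36.0):                   # simulated volumetric moisture readings (%)
    print(f"moisture {reading}% -> irrigate for {irrigation_minutes(reading):.0f} min")
```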

  5. Support for indoor farming

    The growing adoption of IoT tools and software among farmers across the globe has opened up excellent opportunities for intensive indoor farming. The overall growing area can be divided into small environments, each under specific growing conditions, with an open-source platform used for the collection and instantaneous sharing of that data. The data (which includes temperature, humidity, dissolved oxygen and carbon dioxide in the air, and several other critical measures) from one such environment is used to create a ‘climate recipe’ – which can then be followed for growing crops in other, similar indoor environments. Farmers have the opportunity to artificially set up conditions that are conducive to the growth of any particular set of crops (an artificial drought, for example). Indoor farming with computers and internet services offers a high level of precision, and there hardly remains any scope for manual errors or for natural elements playing spoilsport.

Note: The indoor farming methods initiated by the OpenAG Initiative use growing environments named ‘personal food computers’.
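The ‘climate recipe’ idea boils down to recording a set of environmental set-points from one successful grow and replaying them elsewhere; below is a hedged sketch (the field names and values are illustrative, not OpenAG data):

```python
import json

basil_recipe = {                      # set-points recorded from one successful grow
    "crop": "basil",
    "air_temperature_c": 24.0,
    "relative_humidity_pct": 65,
    "co2_ppm": 900,
    "photoperiod_hours": 18,
}

with open("basil_recipe.json", "w") as f:            # share the recipe...
    json.dump(basil_recipe, f, indent=2)

with open("basil_recipe.json") as f:                 # ...and replay it in another growing chamber
    print(json.load(f)["air_temperature_c"])         # 24.0 becomes the controller's set-point
```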

  6. Remote management of crops, fields and equipment

    It is next to impossible for farmers to manually check the health and condition of all the crops in their farm(s). Problems associated with excessive soil dryness, malfunctioning agricultural equipment and other on-field inefficiencies can crop up too – and if these are not detected and rectified quickly, substantial losses in productivity are likely to result. IoT tools and smart sensors typically work as ‘middleware technology’ support, for managing all types of farm resources and connected devices on the same platform. Real-time data from the fields is relayed to a central gateway/microcontroller – and it becomes accessible to farmers through a dedicated mobile application on their smartphones. Technology enables users to keep track of what is happening on their farms on a 24×7 basis, irrespective of their precise location at any time. Monitoring crop health or the performance of farming equipment remotely is no longer a challenge.

  7. Smart tractors get rolling

    Self-driving tractors have already started to revolutionize modern farms. These tractors (launched by companies like John Deere and Hello Tractor) are connected to the World Wide Web via built-in sensors, and can be guided by the farmers with the help of GPS navigation technology. Apart from generating crop and soil data, these high-tech tractors can help in automatic weeding and spraying of pesticides. In fact, the sensors in autonomous farm tractors can actually analyze the components in liquid nutrients, and hence, make sure that the spraying is done in the right amounts. To deliver optimal benefits, a smart tractor should be fitted with a spectrometer, a high-power infrared camera, a small computer, and a fluorescence-measurement tool for chlorophyll monitoring (in addition to, of course, the GPS receiver).  Automated tractors are still comparatively new, and they are likely to become more powerful in the foreseeable future.

Note: The growing popularity of Rowbots (for nitrogen fertilizer application on corn fields) and ‘Bonirob’ (crop inventory tracking robot) serve as classic examples of the expanding usage of robotics in agriculture.

  8. Boosts to poultry and fish farming

    The positive impacts of IoT integration in farming are not limited to crop-growing only. The fish-farming industry has been identified as one of the subdomains where technology can help in a big way. Thanks to real-time water quality, feed and stock monitoring systems, and the data generated by them, farmers can make smarter, better decisions. In addition, it has also become easier to detect and treat diseases. Poultry farming is yet another area of activity where smart technology is finding widespread adoption. Treatment of wastewater and hatchery management are two of the several activities that are becoming mechanized in this sector.

  9. Fighting pest infestations

    Specialized pest control sensors are being made by OEMs, to cut down on crop damages caused by fungi and other pests. These tools typically scan and inspect agricultural fields, and identify plant growth patterns, before identifying pest-infected problem areas (if any), enabling farmers to treat them as quickly as possible. Environmental parameters are factored into the information generated and transferred by these sensors. Thanks to the advancements of IoT practices in agriculture, it is also possible to track previous records of on-field pest infestations. Chances of crop losses due to pests, and consequent heavy financial losses to the concerned farmer, are gradually becoming things of the past.

  10. Smarter livestock management

    The concept of ‘connected cows’ has generated a lot of buzz and speculation over the last few quarters. There is already an application called eCow, which can efficiently track temperature and pH levels (on a daily basis) with the help of a rumen bolus. In general too, IoT has started to help farmers in managing the animals on their farms, via embedded systems that track a wide range of pertinent information (apart from the GPS location of every animal), like activities, pulse rate and temperature, tissue conditions, and other critical biomedical statistics. Since live location information becomes available, it also becomes easier to create geofences. The feeding routine can also be automated, while users can monitor the produce regularly. Also, web-enabled livestock monitoring systems facilitate quick detection of animal diseases (and the required treatment), identification and separation of sick animals from herds, and timely information on animals that pass away. Creating multi-featured wireless boluses with Bluetooth support that would last the entire lifespan of the animals (fitting them with sensor collars is not a viable option) is a challenge, as is ensuring the accuracy of the data generated. In big game reserves, monitoring animals of endangered species (e.g., rhinos) has also been made easier than ever before by connected technology.

Note: A lot of time can be saved, if a farmer can track the position of his/her farm animals on a computer/handheld device at all times.
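Geofencing ‘connected’ livestock is essentially a distance check on each GPS fix; the sketch below uses the standard haversine formula with made-up coordinates and animal tags:

```python
import math


def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in metres between two GPS fixes."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


FENCE_CENTRE = (51.5000, -0.1200)     # illustrative farmstead coordinates
FENCE_RADIUS_M = 500

herd = {"cow-17": (51.5010, -0.1215), "cow-23": (51.5400, -0.0900)}
for tag, (lat, lon) in herd.items():
    outside = distance_m(lat, lon, *FENCE_CENTRE) > FENCE_RADIUS_M
    print(tag, "OUTSIDE the fence" if outside else "inside the fence")
```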

    11. Food safety and logistics

The need for steadily increasing agricultural productivity to support the ever-growing global population has been well documented. Till now, there have been many instances of perfectly healthy crops being harvested – only for them to get damaged and wasted due to improper storage and/or poor transportation/logistics facilities. With IoT monitoring systems, farmers can finally clamp down on such risks. These systems record the temperature, moisture and other conditions in the storage facilities, along with shipping timings, duration of travel, the overall logistics infrastructure, and the transport being used for crop transfer. All records from these systems are stored in the cloud, enabling users to access them as and when required.

   12. Predictions, forecasting, and failure avoidance

Even with full-fledged IoT integration, agriculture is not going to become a completely ‘fail-safe’ sector. However, technology has been instrumental in lowering all types of risks as much as possible – on different fields, and for crops of practically all types. The innovative multi-device tracking/monitoring systems help in drawing up in-depth livestock and crop analytics, and in building a reliable failure-prediction setup (covering failures due to unfavourable soil, weather, crop health, pests or irrigation). In smart precision agriculture, more and more farmers are switching over to IoT-backed models that provide accurate weather/rainfall forecasts.

Note: IoT integration can increase the performance of both horticulture and greenhouse farming, through wireless sensors and smart applications.

The stage is all set for agricultural IoT to revolutionize farming activities, taking average performance levels up by a couple of notches. There are some temporary bottlenecks, emerging from things like the frequent lack of compatibility/interoperability between sensors from different platforms, the sheer volume of big data generated (handling it can be tricky for the average farmer), and the still-existing doubts in the minds of many farm owners. As soon as these hitches are ironed out, the favourable effects of IoT on agriculture will become even more evident.

 

The Rise Of Climate-Smart Agriculture: An Analysis

Among the many factors that have potentially damaging effects on agricultural outputs worldwide, and consequently on farmers, the issue of climate change (CC) has got to be the most serious. Defined as ‘identifiable changes in prevailing climate (statistically testable) that persist over extended periods of time, usually decades’, CC causes fluctuations in temperature levels, deterioration of soil quality, probable decreases in the quality of yields, and rises in atmospheric carbon dioxide, and can even bring about wholesale changes in yearly growing seasons. At present, close to 600 million farms across the globe are struggling to cope with the challenges posed by climate change. It has been estimated that, by 2050, CC will lead to an 11% fall in agricultural output levels, and a whopping 20% rise in average prices. With an eye on improving the sustainability of agriculture, the need of the hour is a gradual reduction of this sector’s over-reliance on climatic factors. That, in turn, brings us to the topic of ‘climate-smart agriculture’, or CSA:

  1. The extent of the problem

    Agricultural yields have traditionally depended on prevailing climate parameters (air temperature, sunlight, humidity, rainfall, etc.). This reliance has always added an air of uncertainty to farming, and has often caused much grief to farmers across the world. The severity of the ‘climate change’ problem is particularly high in countries which already have unfavourable weather/soil conditions. For instance, nearly 1 out of every 3 people in Guatemala suffers from food insecurity, brought about by the uncertainties of agriculture. In Mato Grosso, an apparently minor 1°C increase in temperature can bring down annual corn and soy yields by up to 13%. A University of Leeds report has predicted that farms in temperate and tropical areas will start to be affected (in the form of lowered yields) from 2030, due to 2°C increases in temperature levels. Making the necessary adjustments/technology integrations to adapt to CC would require hefty investments by the developing/underdeveloped nations, to the tune of $200-$300/year (as estimated by the UNEP). The problem is big, and coping with it is a major challenge.

  2. The concept of climate-smart agriculture

    Climate change adversely affects both the quality and quantity of agricultural yields. That, in turn, causes farmers to fall into the trap of food insecurity, and consequent malnourishment and poor quality of life. The prime objective of climate-smart agriculture (CSA) is satisfactorily solving this problem, and delivering food security to everyone concerned. To attain this goal, CSA places prime focus on three factors: i) increases in farm outputs (productivity enhancement), ii) reduction in greenhouse gas emissions, to stall global warming (mitigation enhancement), and iii) boosting the resilience of crops/farms, in the face of climatic vagaries (adaptation enhancement). Interestingly, there are tradeoffs involved among these three factors (often referred to as the ‘3 pillars of CSA’). The challenge lies in integrating climatic elements in the overall agricultural plans, and optimizing the different targets by handling the tradeoff between these 3 factors in the best possible manner.

  3. The importance of geomapping in CSA

    Climate-smart agriculture has emerged as a key element of sophisticated agritech standards in general, and of the application of IoT and sensors in particular. The usage of smart sensors for geomapping – showcasing differences in climate/soil conditions (like temperature, humidity, terrain quality, soil pH value, etc.) across locations by marking them in different colours on a map – is a classic example of this. These farm sensors can be designed to capture real-time data from weather satellites and/or other third-party elements, and send it back to a centralized gateway for detailed analysis. To ensure accurate geomapping and optimal performance of agro-sensors, the cellular network coverage has to be strong (and reliable) enough. Generally, the presence of many tall trees on a farmland can interrupt signals, and hence cause the sensors to malfunction.

  4. Cost-benefit studies in CSA

    Full-scale integration of climate-smart practices involves moderate to heavy expenses – in the form of new tools and gadgets, as well as the need to learn how to use the technological resources optimally. Provided that CSA practices have been implemented properly, the benefits can also be huge – mainly because the uncertainties caused by ‘climate change’ will then be out of the picture. In-depth economic analysis is required for such cost-benefit studies, and to calculate the estimated potential gains from CSA, the net present value (NPV) and internal rate of return (IRR) figures are often referred to (a simple worked example follows below). The discount rate for making these calculations is pre-specified (~12%) – representing the money’s social opportunity cost. A viable statistical model has to be created to track ‘crop response’ levels after applying CSA practices on the field. On the cost side, both the one-time installation expenses and the flow of maintenance expenses have to be taken into account. The economic feasibility of CSA has already become evident in several locations worldwide. At the Trifinio reserve, for example, the IRR of CSA practices has been in excess of 140%. The results have been even more favourable in Nicaragua, where the cost-benefit ratio has risen to 1.85 and the IRR has reached a shade under 180%. Users in Ethiopia have also reported much lower yield variability and ~22% higher outputs as a result of implementing CSA practices.
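
For readers who like to see the arithmetic, here is a minimal sketch of how NPV and IRR can be computed for a CSA investment at the ~12% discount rate mentioned above. The cash-flow figures are made-up illustrations, not data from Trifinio, Nicaragua or Ethiopia.

```python
# Hypothetical cash flows for a CSA upgrade (illustrative figures only):
# year 0 = installation cost, years 1-5 = net gains after maintenance.
cash_flows = [-1500, 400, 450, 500, 550, 600]

def npv(rate, flows):
    """Net present value of a series of yearly cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=0.0, hi=10.0, tol=1e-6):
    """Internal rate of return via bisection (assumes a single sign change in NPV)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(f"NPV at a 12% discount rate: {npv(0.12, cash_flows):.2f}")
print(f"IRR: {irr(cash_flows):.2%}")
```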

Note: The IRR rates in Trifinio and Nicaragua were calculated on the basis of vine crops in home gardens. The cost-benefit figure in Nicaragua is with respect to application of practices on basic grains.

  5. The need to reduce GHG emissions

    More than 40% of the total emission of greenhouse gases comes from agriculture. To ensure the sustainability of farming activities and full food security, reduction of these emission levels is essential. Farmers need to focus squarely on bringing down GHG emissions per unit of produce (kilogram, calorie, etc.), while activities like deforestation have to be done away with as much as possible. Another key requirement in this regard is the management of trees and soil surfaces, so that the latter can serve as reliable ‘carbon sinks’. The livestock sector – which accounts for nearly 15% of total man-made GHG emissions – has to be examined closely, along with existing rice cultivation techniques. In rice/paddy fields, overwatering (and consequent flooding) is one of the principal causes of rising methane emissions. Hence, lowering the frequency of irrigation and allowing the fields to drain properly are basic strategies for reducing methane production. In general, the heavy use of machines and fertilizers in intensive farming often results in a greater release of greenhouse gases into the environment. One of the biggest sub-domains under CSA is the ‘mitigation’ of such emissions. A ‘greener’ environment will be key to sustainable agriculture.

  6. CSA practices

    We have already highlighted how the implementation of CSA practices has benefitted farms in several places. Let us take a look at the most popular ‘CSA practices’ (awareness about CSA was close to 75% by 2014). Using mulch for conservation tillage, with 67% frequency of implementation, is by far the most widely adopted ‘CSA practice’, with agroforestry with hedgerows and crop rotation taking the second and third spots. Other relatively commonly implemented CSA practices include drip irrigation, contour ditch setup, putting up stone barriers, and switching over to heat/pest-resistant crop varieties (maize, beans, etc.). The average increase in yields due to the application of these practices hovers between 25% and 40%, with conservation tillage and drip irrigation offering the maximum gains. CSA practices are expected to become more refined in future – and the advantages of using them will be even more significant.

  7. Greater adaptability is a key requirement

    The global population is rising rapidly, and agricultural outputs have to keep pace with it. Put another way, we have to produce enough food to feed a rapidly growing population (estimated to reach 9.6 billion by 2050). A ~70% spike in food production is required between now and 2050 – and this growth has to take place with ‘sustainable intensification’ (with minimal negative impacts on the environment, and with no adverse effects on future production capabilities). The importance of making agriculture more ‘resilient’ and adapted to ‘climate change’ is paramount – and that involves the implementation of ‘smart farming’ standards, with advanced, internet-enabled tools and gadgets. Right from optimizing irrigation sessions, to monitoring soil quality/temperature/moisture and weather-related information – everything can be tracked with the help of sensors, examined carefully, and used to determine future courses of action. Over the last couple of years or so, artificial intelligence (AI) and machine-to-machine (M2M) communication have emerged as vital cogs of optimized precision agriculture. For managing sensors, cellular modems/gateways/controllers are used.

Note: For a detailed analysis of smart irrigation tools and practices, read this post.

  8. Challenges to overcome

    CSA promises to offer food security and development by increasing agricultural produce and making the sector more sustainable than ever before. However, there are certain bottlenecks that impede the widespread application of CSA practices. For starters, since the gains from moving over to climate-smart farming do not usually become apparent right from the start, many farmowners remain sceptical about the return-on-investment (ROI) factor. In the developing countries, getting farmers acquainted with the necessary technology (computer intelligence and robotics, for example) also remains a considerable challenge. CSA is, by nature, data-driven – and conflicts of interest regarding data ownership can easily crop up. Also, the low-margin nature of the agricultural sector acts as a barrier to climate-smart agriculture. Many growers view the innovations involved in CSA as ‘risky’ – and hence remain averse to making investments in the new farming technologies. Thankfully, CSA projects around the world are being backed by public funding – and we should be able to move beyond most of these challenges soon enough.

  9. Emphasis on ‘ecosystem services’

    While the modernization of agriculture has picked up pace over the past few quarters, the developments have been mostly fragmented – owing to the sectoral approaches taken by growers. CSA looks to make things more efficient by making agricultural advancements holistic, with prime focus on integrated plans and management. Under climate-smart agriculture, the importance of the ‘free ecosystem services’ (soil, air, water, etc.) is factored in – and due care is taken to avoid depletion/damage of these resources in any way. As a rule of thumb, CSA practices should focus on bringing about higher outputs without affecting the quality/availability of these ‘ecosystem services’. Typically, CSA proponents highlight the need to understand the various interdependencies among resources (soil, water, air, forests, biodiversity management), and follow a ‘landscape approach’ for improving output levels and making farms more climate-resilient and adaptable. It also has to be kept in mind that CSA is not a ‘one-size-fits-all’, or even a ‘one-size-for-every-time’, solution. Since several related objectives have to be met, the interactions of elements with the overall landscape and ecosystem layers have to be taken into account. A CSA practice that is mighty effective for Farm A can be absolutely useless for Farm B, due to the differences in the ecosystems of the two fields.

  10. CSA and organic farming

    Organic farming and climate-smart agriculture differ primarily in their approaches. In the former, the ‘methods’ of agriculture are specified (avoiding harsh chemical fertilizers and pesticides), while in the latter, the focus is more on the ‘goals of farming’ (namely, food security via higher yields, lower emissions, greater adaptability and sustainability). Interestingly, many practices involved in organic farming are simultaneously ‘climate-smart’ as well. An example in this regard would be organic farming’s emphasis on boosting organic matter in soil and improving natural nutrient cycling – activities that help preserve carbon in the soil, and make agriculture as a whole more ‘resilient’. Proper nutrition and diet sustainability are two other factors that come under the purview of climate-smart agriculture. Organic farming is closely related to CSA – and if a comparison has to be made between the two, it is CSA that has the more extensive benefits.

  11. CSA in practice

    There are already many instances of successful implementation of CSA practices in different parts of the world. In Kenya, Uganda and Rwanda, dairy production has been intensified with the help of climate-smart packages of practices (PoP) – with the benefits percolating to over 200,000 farmers. The ASI rice thresher in Africa offers heavy economic advantages (easily outweighing its installation costs) – and prevents wastage of rice harvests. In Brazil, the ABC credit-initiative plan is geared to provide low-interest loans to farmers involved in low-carbon farming and other activities related to sustainability. Catfish aquaculture in Vietnam has received a serious thrust, while food security in Africa has received a shot in the arm with the help of the ‘drought-tolerant maize for Africa’ (DTMA) project. Carbon credits were handed out to poor Kenyan farmers, in a bid to improve their land-management capabilities and standards. It is pretty much evident from these use cases that CSA has multiple points of entry – at different levels, and with varying specific goals.

It would be folly to view climate-smart agriculture as a rigid set of technological gadgets and practices (although it can involve the application of IoT and robotics in a big way). The essence of CSA lies in seamlessly integrating solutions at the value chain, food system, ecosystem and landscape, and even policy/decision-making levels. Lowering the gender gap and empowering women (along with other marginalized groups) is another important benefit of CSA. It has been seen that there is significant involvement of women (~43%) in agricultural activities, although their actual land-ownership figures are much lower. With ‘climate-smart practices’, attempts are being made to resolve this problem, and provide everyone with equal opportunities. Coping with ‘climate change’ effectively is now possible, thanks to the growing popularity of CSA. These practices are ideal for making agriculture more sustainable than ever before.

 

Smart Irrigation With IoT: Top 12 Things To Know

Benefits and key features of smart irrigation

The world has close to 7.53 billion people at present. A recent study found that, on average, 33% of the global population suffers from water scarcity in some form or other. By 2030, this figure is likely to rise to 50% – clearly underlining the alarming rate at which the problem of water deficiency is expanding. Interestingly, ~70% of the total volume of water withdrawn in the world is used for irrigation, and that’s precisely where most of the water wastage happens. Around 60% of the water meant for irrigation is lost to evapotranspiration, land runoff, or simply inefficient, primitive usage modes. This, in turn, brings to light the importance of smart irrigation – powered by the internet of things (IoT) – which can go a long way in managing the rising levels of water stress worldwide. In what follows, we will put the spotlight on some interesting facts about smart irrigation:

  1. The need for automated irrigation

    Smart irrigation is a key component of precision agriculture. It helps farmers avoid water wastage and improve the quality of crop growth in their fields by: a) irrigating at the correct times, b) minimizing runoffs and other wastage, and c) determining soil moisture levels accurately (and thereby the irrigation requirements at any place). Replacing manual irrigation with automatic valves and systems also does away with the human-error element (e.g. forgetting to turn off a valve after watering the field), and is instrumental in saving energy, time and resources. The installation and configuration of smart irrigation systems is, in general, fairly straightforward too – which helps the average user.

  2. The IoT-based irrigation system architecture

    A smart microcontroller (which serves as the ‘information gateway’) lies at the heart of the automated irrigation infrastructure. Soil moisture sensors and temperature sensors, which are placed on the fields, send data on a real-time basis to the microcontroller. Generally, a ‘moisture/temperature range’ is specified – and whenever the actual values fall outside this range, the microcontroller automatically switches on the water pump (connected to its output pins). The microcontroller also drives servo motors, to make sure that the pipes are actually watering the fields uniformly (no area gets waterlogged; no area is left too dry). The entire system can be managed by the end-user (farmer) through a dedicated mobile application. Smart irrigation makes it possible for growers to monitor and irrigate their fields remotely, without any hassles. A minimal sketch of this control loop is shown below.
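
Below is a rough, illustrative sketch (in Python) of the kind of threshold logic such a microcontroller might run – not vendor firmware. The sensor and pump interfaces are hypothetical stand-ins for real ADC/GPIO calls, and the threshold is an example value.

```python
import random
import time

MOISTURE_THRESHOLD = 30.0   # assumed lower bound of the acceptable moisture range, in %

def read_soil_moisture():
    # Placeholder: a real controller would read an ADC pin or a field sensor here.
    return random.uniform(20.0, 70.0)

def set_pump(on):
    # Placeholder for toggling the output pin / relay that drives the water pump.
    print("Pump ON" if on else "Pump OFF")

for _ in range(5):                      # a real controller would loop indefinitely
    moisture = read_soil_moisture()
    # Switch the pump on only when moisture drops below the configured threshold.
    set_pump(moisture < MOISTURE_THRESHOLD)
    time.sleep(1)                       # shortened from ~60 s for this demo
```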

  3. The use of internet

    The flow of information to and from the centralized gateway (here, the microcontroller) has to be supported by stable internet services. Wireless low-power networks (e.g., LoRaWAN or Sigfox) can easily be used to power the sensors. These sensors send field information to the local computer of the user, or to a cloud network (as required). There, the system can combine the information with other inputs from third-party services (say, the local weather channel) to arrive at ‘intelligent irrigation decisions’. For example, if rain is in the forecast, water will not be released – even if the real-time data suggests that the field needs irrigation (see the decision sketch below). Recalculations are done at regular intervals.
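
Here is a minimal sketch of that kind of ‘intelligent irrigation decision’, assuming a hypothetical rain-forecast figure is already available (a real system would pull it from a weather API or satellite feed). The 5 mm cut-off and moisture thresholds are illustrative assumptions.

```python
def should_irrigate(soil_moisture_pct, threshold_pct, rain_forecast_mm):
    """Skip irrigation when meaningful rain is expected, even if the soil reads dry."""
    if rain_forecast_mm >= 5.0:   # assumed cut-off; tune per crop and region
        return False
    return soil_moisture_pct < threshold_pct

# Dry soil, but 8 mm of rain forecast -> hold off; no rain forecast -> irrigate
print(should_irrigate(soil_moisture_pct=24.0, threshold_pct=30.0, rain_forecast_mm=8.0))  # False
print(should_irrigate(soil_moisture_pct=24.0, threshold_pct=30.0, rain_forecast_mm=0.0))  # True
```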

Note: Smart irrigation systems can save up to 45% water during the dry season, and around 80% water in the rainy season, compared to manually operated watering systems.

  4. The cost advantages

    In an automated irrigation infrastructure, there is no room for resource (read: water) wastage. As a result, there are cost benefits to be gained as well – by replacing the traditional watering system with a fully self-operating one. The chances of crops dying due to excessive (or insufficient) watering are minimal, which means that farmers will not have to worry about frequent plant replacement. Also, since smart agriculture in general, and smart irrigation in particular, is all about faster, healthier crop growth – the average crop cycle is shortened, and there is every chance of annual yields being higher. IoT-powered irrigation tools can be used in lawns, gardens and landscapes too.

  5. Types of sensors used

    Several types of sensors are used to relay data to the irrigation microcontroller unit – each dedicated to capturing and transmitting specific data. The first are the soil moisture sensors (or SMS), which examine the dielectric constant of soil surfaces to estimate the volumetric water content of the surface (this moisture level is directly proportional to the dielectric constant reading – see the sketch below). SMS controllers can either be ‘on-demand’ (capable of initiating and terminating irrigation sessions) or ‘bypass’ (allowing or skipping scheduled irrigation sessions, based on pre-specified threshold levels). Next up are the temperature sensors, which typically use advanced Resistance Temperature Detector (RTD) components to track soil temperature levels accurately. The ‘relay’ systems are made responsible for turning the pump(s) on or off, as per the precise soil requirements at any time.
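
The sketch below illustrates the two ideas just mentioned – mapping a dielectric-constant reading to an approximate volumetric water content (using an assumed, simplified linear calibration; real probes use manufacturer-specific curves) and a ‘bypass’-style check against a threshold. All constants are hypothetical.

```python
def dielectric_to_vwc(dielectric_constant, slope=2.5, offset=-5.0):
    """Map a dielectric-constant reading to an approximate volumetric water content (%)."""
    return max(0.0, slope * dielectric_constant + offset)

def bypass_allows_irrigation(scheduled_event, vwc_pct, upper_threshold=55.0):
    """'Bypass' logic: let a scheduled irrigation event run only while the soil is below the threshold."""
    return scheduled_event and vwc_pct < upper_threshold

vwc = dielectric_to_vwc(18.0)
print(f"Estimated VWC: {vwc:.1f}%")                       # 40.0%
print("Irrigate:", bypass_allows_irrigation(True, vwc))   # True
```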

Note: Soil moisture sensors offer much more efficient on-field irrigation than traditional, timer-based sprinkler systems. There are no risks of overspraying or overwatering with the former.

  6. Incorporating the climatic parameters

    While there are many merits of the smart soil moisture sensors, they do not factor in weather-related parameters in any way – and that remains a limitation. Significant amounts of moisture are lost due to evapotranspiration (ET; the total water lost from the plant leaves via transpiration, AND from the soil via evaporation). Hence, crop-growers should ideally think beyond SMS controllers, and start using the ‘smarter’ evapotranspiration controllers, or weather-based irrigation controllers (WBICs). These work with high-quality weather sensors – which receive real-time weather updates, and use the same for customizing the irrigation events (a simplified runtime calculation is sketched below). WBICs can also work with historical weather information and/or data received from satellites. Other unique characteristics of a particular crop field – right from the types of plants and the nature of the soil, to the ground slope and the amount of sunlight available – are also taken into account, for determining the exact amount of watering a place needs at any point in time.
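
As a rough illustration of how a WBIC might turn an evapotranspiration figure into a watering schedule, here is a sketch that estimates daily irrigation minutes from a reference ET value. The crop coefficient and sprinkler precipitation rate are hypothetical example numbers, not calibrated field data.

```python
def irrigation_minutes(et0_mm_per_day, rainfall_mm, crop_coefficient=0.85,
                       sprinkler_rate_mm_per_hr=12.0):
    """Estimate minutes of irrigation needed to replace the day's net water loss."""
    crop_et = et0_mm_per_day * crop_coefficient     # water the crop actually lost
    deficit_mm = max(0.0, crop_et - rainfall_mm)    # rainfall offsets part of the loss
    return 60.0 * deficit_mm / sprinkler_rate_mm_per_hr

# A hot, dry day versus a day with 4 mm of rain
print(f"{irrigation_minutes(6.0, 0.0):.0f} minutes")   # ~26 minutes
print(f"{irrigation_minutes(6.0, 4.0):.0f} minutes")   # ~6 minutes
```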

  7. The role of LED lights

    A smart irrigation unit, with microcontroller(s) at its core, also has pre-tested LED bulbs. When the on-field sensors report that the moisture level has fallen below the recommended/threshold level – the bulb glows, indicating that an irrigation event has to be initiated (i.e., the sprinkler valves have to be turned on). LED lights are also an important part of ‘tank overflow control models’, which work with powerful ultrasonic sensors. As long as the pump motor is running and the water level in the tank is beneath the threshold level – the bulbs glow. In essence, the LED lights serve as handy tools to indicate the status of the pumps and sprinklers at any time. Readings from the SMS-es or the ultrasonic tank sensors can be displayed on a mobile app, for the convenience of farmers.

Note: Users can see the water level in a tank, or the soil moisture levels, on LCD screens.

  8. The placement of sensors

    It’s all very well to set up gateways and pumps and other tools, but unless the sensors are placed correctly in the fields – the ‘decisions’ taken by the smart irrigation network can very well be erroneous. Experts recommend that users make sure the sensors remain in contact with the soil surface at all times (ruling out the presence of any ‘air gaps’), and are placed a minimum of 5 ft. away from irrigation heads, property lines, homes, and high-traffic zones. For best results, the sensors should be strategically placed in the area(s) that receive the maximum sunlight, and within the root zones of the plants (at a depth of ~3”). A soil moisture sensor has to be covered with soil, but the surrounding pressure should not be too high.

  9. The rise of smarter sprinklers

    One of the biggest advantages of switching over to a smart irrigation regime is the considerable volume of water savings. These savings can be increased even further (by around 20%) by ditching outdated sprinkler systems and instead using nozzles that can spray rotating water streams in multiple trajectories. These ‘smarter sprinklers’ go a long way in ensuring uniform distribution of water to all parts of the field (or a section of it), and offer much greater resistance to changes in weather conditions (wind speed, mist, etc.). The water released by these rotating-head sprinklers is mostly soaked in by the soil, thereby minimizing runoffs and other forms of wastage.

Note: Rain sensors have also already found widespread acceptance among crop-growers in different countries. These sensors double up as ‘shutdown devices’, sending signals to stop automated sprinklers during (and just after) heavy rainfall.

    10. More prompt fault detection and repair

Small leaks and cracks in traditional irrigation systems (in tanks, reservoirs, etc.) can lead to considerable water loss – adding to the already mounting global water crisis. What’s more, manually detecting the source of these problems is often difficult, and a potentially time-consuming affair. Installing smart irrigation tools is a great way to keep such problems at arm’s length. With IoT support, these controllers can detect existing problems in any irrigation unit in real time – which, in turn, makes it easy for users to carry out the necessary repairs immediately. In essence, an internet-enabled irrigation system can ‘supervise’ the condition of the tanks, pumps and other units – without the user having to stay in front of a computer at all times.

    11. The cost factor

While some investment is required to implement smart irrigation solutions on a field, the sensor costs are far from being exorbitant. On average, the price of a soil moisture sensor lies in the $150-$160 range, while that of the more advanced WBICs is around $300. The rotating sprinklers (which, incidentally, are ideal for irrigating slopes) are priced on a per-unit basis (around $6 or $7). Large manufacturers also offer special rebates on the sensors and sprinkler units. Given the potential benefits of upgrading to a smart plant-watering system – the cost figures are relatively reasonable.

Note: SoCal WaterSmart is a leading rebate programme that covers smart irrigation controller systems. For crop-growers with minimal technical expertise, an IoT irrigation device like CropX (which reduces water wastage and helps in increasing yields) is ideal.

    12. The challenges

The adoption of IoT in agriculture has gone up immensely in recent times – but even so, the concept of ‘smart irrigation’ remains a relatively new one. Most of the existing smart irrigation controllers have many complex features and capabilities – which, while perfectly suited for large-scale commercial usage (e.g., on a golf course), are way too elaborate for small farmowners and individual gardeners. The need of the hour is to raise the awareness about, and the familiarity with, these smart irrigation systems among people – particularly since user-inputs (type of crops, soil, surface slope, etc.) are critical for the performance of these systems. Also, it has to be kept in mind that the room for error in a ‘smart system’ is much lower than in a traditional set-up. A mechanical failure or a network snag can have serious consequences.

There are plenty of things to be said in favour of smart irrigation setups. For starters, they help in the optimal utilization of water – ensuring uniform watering of plants, at the right times, and in the right amounts. With the help of high-end sensors, they can also factor in climatic parameters, to make the irrigation routine more efficient. Significant savings are to be had, both in terms of much lower water wastage and a diminished need for manual labour. With intelligent ‘irrigation decision-making’ capacities, advanced IoT-supported smart irrigation controllers are changing the face of agriculture. The field is evolving rapidly, and it will be interesting to track further developments in this domain in the foreseeable future.

 

Soil-less Agriculture: An Overview Of Hydroponic Farming

 

An analysis of hydroponic farming

 

For proper growth, crops require a reliable medium that is responsible for capturing and storing the essential plant nutrients. In traditional agriculture, this role is performed by soil. However, the rapid emergence of hydroponic farming over the last couple of years or so has somewhat diminished the importance of soil in agriculture – with ‘soil-less farming’ becoming a very real possibility. A recent study estimated that the value of the global hydroponics industry will be well over $395 billion by the end of this decade (growing at a CAGR of ~6.8%). Here, we will present some interesting facts, features and characteristics of hydroponic farming:

  1. What is hydroponics all about?

    In essence, hydroponic farming is all about growing plants and crops without soil. In this method, plant roots are brought directly into contact with liquid (generally, water-based) nutrient solutions – ensuring healthy growth. The nutrients are either reused or drained off, as required. Since there is no soil involved, the development of large root systems (to draw in nutrients) is not required – and typically, the intake of nutrients by the fibrous roots of hydroponically-grown crops is very efficient (minimal wastage). In hydroponics, soil is replaced by a reservoir or a medium made of a different material, which absorbs the necessary nutrients from the water-based solution.

Note: Soilless agriculture is not, per se, a particularly innovative concept. Reports suggest that farm research experts from the 18th century were well aware of it. Dr. William Frederick Gericke is credited with coining the word ‘hydroponics’ in 1936.

  2. What materials can be used as the ‘growing medium’ in hydroponics?

    Several different materials can be used to create the nutrient-absorbing medium (in essence, the substitute of soil). Depending on the precise requirements of crops and, of course, the farmer – materials like sand, hydrocorn, expanded shale and coco peat are used for the purpose. Clay pellets, vermiculite and rockwool are often opted for by hydroponic farmers as well. To be usable as the medium for hydroponic plant growth, the material has to be inert – and ensure that the crops have ready access to the liquid nutrient solution, light, oxygen and other essential enzymes (mixed with the nutrient solution).

Note: The common characteristic of all the hydroponic growing mediums is their ‘inertness’ – their inability to support the growth of plants on their own, in the absence of additional nutrients. The medium is only responsible for supporting the weight of the crops, and for facilitating the passage of oxygen/nutrients.

  3. Do plants grow faster in hydroponic farming?

    There is very little doubt about that. On average, the growth rate of a plant is close to 40% higher in a hydroponic setup – compared to the traditional soil-based farming method. The annual crop yields can be as much as 75% higher – making hydroponics a great technique for large-scale, commercial crop-growers (in particular). The main reason for the shorter crop cycles and much quicker time-to-harvest in hydroponic farming is the direct contact of advanced, high-quality nutrients with the plant roots. It’s like providing the best food directly to plants – ensuring significantly faster growth. Unlike soil farming, there is no wastage of nutrients, and plant growth happens in a controlled, efficient environment. Since hydroponic farming is considerably less labour-intensive than soil farming, availability of manual resources is not much of an issue either.

Note: There is no soil in hydroponic gardens, and hence, there is no need for spraying pesticides and strong chemicals – which can potentially have adverse side-effects on the crops.

  4. Types of hydroponic systems

    There are several alternative hydroponic system setups that farmers can opt for – depending on the exact requirements of the plants/crops they wish to grow.

  • The ‘Water Culture’ (also known as ‘Deep Water Culture’) is probably the simplest system, involving careful suspension of the plant roots in the water-based nutrient solution. Growers have to ensure that light does not enter the system directly (failing which, there can be significant algal growth), and air pumps are used to supply oxygen to the solution (and hence, to the roots). In this method, the plants are put in pots supported by polystyrene ‘floater’ boards. The tank, which contains the nutrient solution, is drained at regular intervals.
  • The ‘Nutrient Film Technique’ (or, NFT) can be applied to ensure optimal utilization of nutrients/resources, and superior-quality plant growth. The nutrient solution regularly passes over the tips of the roots (the channel has to be kept at a slight tilt, to facilitate smooth runoff) – and the plant gets the required oxygen both from the solution and from the air. The solution moves from the tank to the growing medium (usually, rockwool is used in this method) through a tube – and creates a film/layer of nutrients on the medium (that’s how the method gets its name). The used solution can either be recirculated, or drained out (run-to-waste NFT).
  • The principle behind the ‘Flood and Drain’ hydroponic system is also simple enough. The growing medium is ‘flooded’ with the nutrient solution at certain time-intervals. A timer is set up in the system, to repeat the ‘flooding’ process. As the solution keeps flowing across the medium, the latter absorbs important nutrients – and that, in turn, supports the growth of plants. Typically, crops that can withstand small periods of dryness are grown by this method (also known as ‘ebb and flow’ system). A point of concern in this system is the risk of a missed alert from the timer – which can lead to excessive dryness and plant suffocation.
  • The ‘Dripper’ system has similarities with the ‘Flood and Drain’ method, particularly since this one also requires a pump (for transferring the nutrient solution) and a timer. However, in the ‘drip’ system, the solution is actually dripped on to the roots of the plants and the growing medium. Hydrocorn, clay pebbles and rockwool – which drain slowly –  are the best mediums to be used in this system. Once again, the nutrient solution can either be reused or drained off. A potential downside of ‘dripper’ systems is the chance of the drip tubes/drippers getting clogged due to the formation of nutrient particles (the problem is more common when organic nutrients are used).
  • The ‘Wicking’ method of hydroponic agriculture is also popular. In this system, vermiculite or perlite mediums are generally preferred – and farmers have to either connect the plant roots with the nutrient solution through a wick, or plunge the lower portion of the medium directly into the solution (nutrients get wicked directly to the roots). Mediums that have high absorption capacities (e.g., rockwool) are not used, since they can cause suffocation of the plants (due to excess amounts of nutrients absorbed).

        5. The Aeroponics system

Although ‘aeroponics’ is another system of hydroponic farming, its technical differences from the others merit a separate mention. In this setup, the plant roots are kept suspended in the air, and the nutrient solution is sprayed/misted on them. A pump is used to automate the misting activity every few seconds (a timer is used in the system as well). Like the ‘Flood and Drain’ method, ‘aeroponics’ also relies heavily on air as an important source of oxygen for the roots. A pond fogger or a fine spray nozzle is used for misting the roots with the solution.

Note: AeroGarden is a classic example of commercial application of the aeroponics growing method.

  6. How does hydroponic farming do away with uncertainties?

    In a soilless agriculture setup, there are none of the uncertainties that are typically associated with traditional farming methods (soil fertility, presence of soil organisms and pests, etc.). Farmers get the opportunity to form a preset ‘nutrient regimen’ – with complete control over the nutrients (volume and quality), pH levels (the 5.8-6.8 range is considered to be ideal) and oxygen availability. Problems, if any, can easily be detected and got rid of, and the entire hydroponic system can be replicated without any hassles. Enhanced reliability is a big factor working in favour of hydroponic farming.

Note: The ‘nutrient regimen’ should primarily have six ‘macro nutrients’, along with smaller amounts of the ‘micro nutrients’. Farmers also often mix the elements of two or more hydroponic systems to create ‘hybrid systems’.

  7. Does hydroponic farming help in water conservation?

    Yes, and in a big way. In traditional soil farming, significant amounts of water evaporate – resulting in wastage (both of the water and of the nutrients present in it). Since hydroponics does not involve dirt in any way, there is far less scope for evaporation or unnecessary drainage – and the water-based nutrients can easily be recycled. Experts have reported that the total volume of water required to irrigate hydroponic gardens is about one-tenth of the amount required in soil-based ecosystems. This makes the method highly suitable for growing plants in relatively arid regions (countries in the Middle East, for instance). As a rule of thumb, fresh water is used for soilless farming, and growers have to allow some time (a day, ideally) for chlorine and other chemicals in the water to dissipate. After that, nutrients can be mixed into the ‘clean’ water, to create the ‘nutrient solution’. Rainwater is treated as the best possible source of water for hydroponics, while the filtered water made available through reverse osmosis is also good. Using heavily chlorinated water or hard water is an absolute no-no.

Note: The electrical conductivity (EC) level of water used for hydroponic farming should ideally be around 10.

  8. What are the best conditions for hydroponic farming?

    As already mentioned earlier, hydroponics is free from the vagaries of soil quality, while properly prepped freshwater (chlorine and other chemicals removed) is best-suited for this plant-growing method. The dissolved oxygen (DO) level in the nutrient solution – with air acting as an important source of oxygen here – should be a minimum of 6 ppm (parts-per-million). For typical ‘cool season crops’, the temperature range of 10°C – 21°C is optimal, while ‘warm season plants’ grow best when the temperature is between 15°C and 27°C. Plants in a hydroponic garden also require at least 8 hours of sunlight on a daily basis (in the absence of proper sunlight, farmers can use high-intensity sodium lamps). The water should be drained once every week (plant growth and yield can be affected by contaminated water), and the entire system should be leached/flushed just before harvest. In certain hydroponic systems (Flood and Drain, NFT), adjusting the pH levels regularly is also important (a simple range-check sketch follows below).
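
To make these thresholds concrete, here is a minimal sketch that checks a set of readings against the ranges quoted above. The parameter names and sample readings are made-up illustrations, and the ‘warm season’ temperature band is used by default.

```python
OPTIMAL_RANGES = {
    "ph": (5.8, 6.8),
    "dissolved_oxygen_ppm": (6.0, float("inf")),
    "temperature_c": (15.0, 27.0),            # 'warm season' band; use 10-21 for cool-season crops
    "daily_light_hours": (8.0, float("inf")),
}

def out_of_range(readings):
    """Return a list of parameters that currently sit outside their target range."""
    issues = []
    for name, (lo, hi) in OPTIMAL_RANGES.items():
        value = readings[name]
        if not (lo <= value <= hi):
            issues.append(f"{name}={value} (expected {lo}-{hi})")
    return issues

sample = {"ph": 7.1, "dissolved_oxygen_ppm": 6.5, "temperature_c": 22.0, "daily_light_hours": 9.0}
print(out_of_range(sample) or "All parameters within range")
```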

Note: Apart from the removal of chlorine, calibration for pH also becomes easier when the collected freshwater is allowed to rest for a day or two.

  9. How does hydroponics help in better resource utilization and managing pollution?

    In the same space, the number of plants that can be grown with hydroponics is nearly four times what is possible with soil-based agriculture. The plants can be put in small pots or containers, placed on a countertop – and connected to the nutrient solution tank below (the pots can be suspended in the water-based solution as well). In the traditional method, the minimum area requirement for growing a plant would be a five-gallon (probably larger) bucket. This opens up the possibility of more harvests and much higher yields from hydroponic farming (with the considerably faster plant growth also contributing to this). Soil farming can also pose serious environmental challenges, with water bodies being polluted by soil nutrients not used up by crops, and the probable accumulation of salt in the groundwater (salination). The runoff of chemical nutrients to lakes/rivers can lead to deoxygenation – putting the lives of aquatic animals under threat. The use of pesticides brings in risks of air pollution. In hydroponics, there is no soil, and no such potential environmental hazards. It’s a ‘green’ method!

Note: There are many areas with harsh climatic conditions, where soil maintenance becomes a huge concern for farmers. Hydroponic farming is a great option in such places. Weeding is yet another task that hydroponic farmers need not worry about.

     10. Are there any associated risks or challenges?

Hydroponics is a simple method of alternative farming (the most technical thing about it is probably its name!) – and there is not much in the way of risk in this system. However, growers have to make sure that the plants can access the nutrient solution at all times – otherwise, the roots can become too dry very quickly. In the ‘dripper’ and the ‘Flood and Drain’ systems, special care has to be taken about the reliability of the timers/alarms. Any malfunction in the latter can cause serious damage to the plants. In general, while the automated nature of plant feeding and growth has a host of advantages (higher yields, faster growth, better quality, optimal nutrient use, etc.), things can quickly go bad in the event of a lengthy mechanical failure. Hydroponic plants tend to be smaller in size (and with smaller, less complex roots) than plants grown in traditional soil fields.

Note: Hydroponics can be applied for both indoor and outdoor farming, including plant growing in greenhouses. The method is best suited for plants that have shallow root systems. A wide range of fruits, houseplants and veggies, right from spinach and herbs, to lettuce and radish, can be grown with hydroponic farming.

While hydroponics might seem to be a variant of organic farming at first, the two are actually entirely different methods. The former does not have any role for soil, while organic farming requires the conversion of nutrients by the soil (so that they can be absorbed by plant roots). In terms of nutritional value and ecological impact too, hydroponics offers greater benefits than conventional soil-based farming. The systems are easy to set up, making DIY hydroponics relatively simple too. For professional crop-growers as well as general gardening enthusiasts, it is now possible to grow healthy plants…without having to get their hands dirty!

ARKit and Core ML: An Overview Of The New Apple Frameworks

At this year’s Worldwide Developers Conference (WWDC; 5-9 June), Apple announced two new frameworks – an augmented reality (AR) developer kit named ARKit, and a machine learning API called Core ML. The frameworks will be among the key features of iOS 11 (the third beta was released earlier this month) – the latest version of the company’s mobile platform. In today’s discussion, we will give you a brief idea about ARKit first, and move on to Core ML next:

ARKit

“Augmented reality is going to help us mix the digital and physical in new ways.”

— Mark Zuckerberg, Facebook F8 Conference

Over the years, there has hardly been any activity from Apple in the virtual reality (VR) and augmented reality (AR) realms. As major rivals like Amazon (with Alexa), Microsoft (with HoloLens) and Google (with Project Tango) have upped their respective games, all that we got from Apple on the ‘intelligent’ front were Siri and iOS 10’s QuickType. The scene has changed with the arrival of ARKit, which has been billed as the ‘largest AR platform in the world’ by Apple.

  1. More power to developers and apps

    For third-party mobile app developers working on the iOS platform, ARKit brings in never-before capabilities to blend AR experiences into their applications. With the help of the framework resources, the motion sensors, and of course the camera of the iPhone/iPad, devs will be able to make their software seamlessly interact with the actual environment (read: digital tools will enrich the real world). The role of AR in Pokemon Go was only the tip of the iceberg (many users even reported that the gameplay improved when AR was turned off) – and ARKit will help developers go all in to integrate augmented reality in their apps, to make the latter unique, useful and more popular than ever before.

  2. The fight with Google and Facebook

    Apple is late to the AR game, there are no two ways about that. For ARKit to be able to make a mark, it has to offer something more than the AR-based solutions of Facebook and Google, which are both established players in this domain. Interestingly, Apple’s new framework DOES seem to have a key advantage: it is compatible with all existing iDevices running on the A9 or A10 chip, while for integrating Project Tango, Android OEMs have to create separate, customized hardware. Also, Facebook’s AR activity is, at least till now, confined to its own Camera app only. ARKit, on the other hand, will be pushed out to all iPhone/iPad applications. In terms of existing user-base, Apple certainly holds a strong position.

  3. How does ARKit work?

    The ARKit framework does not form three-dimensional models to deliver high-end AR experiences to app-users. Instead, it uses a technology called Visual Inertial Odometry (or, VIO) – which combines information from the CoreMotion sensors and the device camera, to track the movement of the smartphone/tablet in a room. Put another way, a set of points is traced in the environment by ARKit – and these points are tracked as the device is moved. This functionality is expected to help developers create customized virtual-world experiences over the real environment with their new apps (the superior processing speeds of the A9/A10 chips are also an important factor). ARKit does not need any external calibration either, and should typically generate highly accurate data.

Note: The process in which ARKit integrates virtual elements into the real world with the help of projected geometry is known as ‘world tracking’.

      4. The role of dual cameras

Apple’s decision to do away with the headphone jack in the iPhone 7 raised quite a few eyebrows. There has also been considerable curiosity about the presence of dual cameras in the handset. The announcement of ARKit fully justifies the latter decision though. With the help of the dual cameras, correctly gauging the distance between two viewpoints (from the device’s current location) becomes easier, and triangulation of this distance is also possible. The two cameras, working together, offer improved depth sensing, and obviously, better zooming features as well. This, in turn, helps the handset create pinpoint-accurate depth maps, and differentiate between background objects and foreground objects.

     5. Finding planes and estimating lights

Floors, tables and other basic forms of horizontal planes can be detected by the ARKit framework. After detection, the device (iPhone/iPad) can be used to place virtual objects on the tracked surface/plane. The plane detection (scene understanding) is done by devices with the help of the ‘scenes’ generated by the built-in camera. What’s more, the framework can also determine the availability of light in different scenes, and ensure that the virtual objects have just the right amount of lighting to appear natural in any particular scene. From tracking the perspective and scale of viewpoints, to shadow correction and performing hit-tests on digitized objects – the ‘world tracking’ functionality of ARKit can handle them all.

Note: Once the scene understanding and estimation of lighting is done, virtual elements can actually be rendered to the real environment.

     6. The limitations of ARKit

ARKit is Apple’s first native foray into the world of VR/AR, and the tech giant is clearly planning to take small steps at a time. As things stand at present, the framework lacks many of the convenient features of Google’s Project Tango – from the capability of capturing wide-angle scenes with the help of additional cameras, to full room scanning and the creation of 3D room models without external peripherals (which iOS would still require). The framework is not likely to have in-built web search capabilities (as Facebook’s and Google’s AR solutions have) either. What ARKit is expected to do (and do well) is motivate developers to come up with new app ideas with AR as their USP. It does not place any extra pressure on the device CPU, and also offers high-end object scalability. The Apple App Store has more than 2.2 million apps – and if a significant percentage of them get AR features (e.g., the option of activating an AR mode), that will be instrumental in helping the technology take off in a big way.

In 2013, Apple coughed up around $345 million to acquire PrimeSense, the 3D sensor company whose technology powered Microsoft’s Kinect sensor. A couple of years later, the Cupertino company swooped in once again, with the acquisition of Linx (a smart camera module manufacturer) for $20 million – and Metaio (an AR startup). ARKit might be the first significant augmented reality tool from Apple, but the company has been clearing the way for it for a long time. The arrival of this framework is big news, and it can revolutionize the interactions of iDevice owners with their mobile apps.

 

Core ML

The global artificial intelligence (AI) market has been estimated to touch $48 billion by the end of this decade, growing at a CAGR (2015-2020) of more than 52%. Once again, Apple has been relatively quiet on the AI and machine learning (ML) front (apart from the regular improvements in Siri). Rivals like IBM, Google, Facebook and Amazon are already firmly entrenched in this sector, and it will be interesting to see whether Core ML on iOS 11 can put Apple in a position of strength here.

     1. What exactly is Core ML?

Core ML has been created as a foundational framework for delivering optimized machine learning services across the board for Apple products. The implication of its arrival is immense for app-makers, who can now blend superior AI and ML modules into their software. The manual coding required for using Core ML is minimal, and the framework offers support for deep learning with more than 30 different layer types. With the help of Core ML, devs will be able to add custom machine learning capabilities to their upcoming apps for the iOS, tvOS, watchOS and macOS platforms.

   2. From NLP to machine learning

Way back in 2011, natural language processing (NLP) debuted on iOS 5 through NSLinguisticTagger. iOS 8 brought in Metal, a tool that accessed the graphics processing unit (GPU) to deliver enhanced, immersive gaming experiences. In 2016, the Accelerate framework (for processing signals and images) received something new – the Basic Neural Network Subroutines (or, BNNS). Since the Core ML framework is designed on top of both Accelerate and Metal, the need to transfer data to a centralized server is eliminated. The framework can function entirely within a device, boosting the security of user data.

Note: The iPhone 8 might well have a new AI chip. If that happens, it would be perfectly in line with Apple’s attempts to create a space for itself in the machine learning market.

    3. How does Core ML work?

The operations of the Core ML framework can broadly be divided in two stages. In the first stage, machine learning algorithms are applied to available sets of training data (for better results, the size of the training dataset has to be large) – for the creation of a ‘trained model’. The next stage involves the conversion of this ‘trained model’ to a file in a .mlmodel format (i.e., a Core ML Model). High-level AI and ML features can be integrated in iOS applications with the help of this Core ML Model file. The function flow of the new machine learning API can be summarized as: creating ‘trained models’ → transforming them into Core ML models → using them to make ‘intelligent’ predictions.

The Core ML model contains the class labels and all inputs/outputs, and describes the layers used in the model. The Xcode IDE has the capability of creating Objective-C or Swift wrapper classes (as the case might be), as soon as the model is included in an app project. A minimal sketch of the ‘train → convert → save’ flow is shown below.
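
As a rough illustration of the two-stage flow described above, here is a minimal Python sketch, assuming the coremltools package and its scikit-learn converter are available; the training data, feature names and output file name are made-up examples.

```python
from sklearn.linear_model import LinearRegression
import coremltools

# Stage 1: train a (deliberately tiny) model on illustrative data
X = [[1.0], [2.0], [3.0], [4.0]]
y = [2.1, 4.2, 5.9, 8.1]
model = LinearRegression().fit(X, y)

# Stage 2: convert the trained model into a Core ML model and save it as .mlmodel
coreml_model = coremltools.converters.sklearn.convert(
    model, input_features=["measurement"], output_feature_names="prediction"
)
coreml_model.save("Predictor.mlmodel")
# Dragging Predictor.mlmodel into an Xcode project generates the Swift/Objective-C
# wrapper class that apps use to make on-device predictions.
```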

    4. Understanding Vision

While ARKit and Core ML were the frameworks that grabbed most of the headlines at WWDC 2017, the arrival of a new computer vision and image analysis framework – appropriately named Vision – has been equally important. Vision works along with Core ML, and will offer a wide range of feature detection, scene classification and identification capabilities – right from ML-backed picture analysis and face recognition, to text and horizon detection, image alignment, object tracking and barcode detection. The wrappers for the Core ML models are generated by the Vision framework as well. Developers have to, however, keep in mind that Vision will be useful only for models that are image-based.

Note: Just like the other two frameworks, Vision also works with the SDKs of iOS 11, tvOS 11 and macOS 10.13 beta.

  5. Supported models

    The Core ML model, as should be pretty much evident from our discussion till now, is THE key element of the Core ML framework. Apple offers as many as 5 different, readymade Core ML models for third-party developers to use for creating apps. These models are Places205-GoogLeNet, Inception V3, ResNet50, SqueezeNet and VGG16. Since Core ML works within the devices (and not on cloud servers), the overall memory footprints of these models are fairly low. Apart from the default-supported models, the new API supports quite a few other ML tools (libSVM, XGBoost, Caffe and Keras).

Note: Whether a model is to be run on the GPU or the CPU of the device is decided by the Core ML framework itself. Also, since everything is on-device, the performance of machine learning-based apps is not affected by poor (or unavailability of) network connectivity.

  6. The limitations of Core ML

    There is no doubt about the potential of Core ML as an immensely powerful tool in the hands of developers, for seamlessly adding efficient machine intelligence to apps that would be usable on all Apple hardware devices. However, much like ARKit, this framework too seems slightly undercooked on a couple of points. For starters (and this is a big point), Core ML is not open-source – and hence, app-makers have no option to tweak the API for their precise development requirements (most other ML toolkits are open-source). Also, in the absence of ‘federated learning’ and ‘model retraining’ in Core ML, the training data has to be provided manually. The final release of iOS 11 is still some way away, and it remains to be seen whether Apple adds any other capabilities to the framework.

Tree ensembles, neural networks, SVM (support vector machines) and regression (linear/logistic) are some of the models that are supported by Core ML. It is a framework that will make it possible for iOS developers to consider making apps with machine learning as one of their most important features. Core ML has been hailed by Apple as ‘machine learning for everyone’ – and it certainly can bring in machine learning (ML) and deep learning (DL) as an integral part of iOS app development in future.

App Entrepreneur Hussain Fakhruddin Talks About His Role As A Coach At Teksmobile

(This post was originally published as a press release over here)

 

In a recent exclusive interview, noted app evangelist Hussain Fakhruddin – the CEO of Teksmobile (Australia | India | Sweden | USA) – reflected on his role as a coach and a motivator for his colleagues over the years. The startup recently completed 11 years of existence, and as Mr. Fakhruddin highlighted, human resources are the greatest asset the company has in its possession.

The Teks team, with CEO Hussain Fakhruddin

 

Here is the transcription of an excerpt from the interview:

“I like robots, but only in story books and movie screens. When I had conceptualized a tech startup some 11 springs back – one which would have the capacity to challenge the biggest of mobile app companies in the world – I was clear about one thing: I did not want robots in my company. From the very start, Teksmobile has had the good fortune of having diverse and interesting personalities in its fold – and I daresay that they, with a little bit of guidance and coaching from yours truly – have been instrumental in forming the Teks success story.

Take the case, for instance, of the man who performs the dual role of Key Accounts Manager and Office Admin at my office. A true-blue workaholic, he goes about his job every day with a smile on his face, and an easy-going confidence that is well and truly infectious. The first thing I had noticed when he came in for his interview at Teks was the steely determination in his eyes and a willingness to embrace challenges. Given the range of responsibilities that this post brings with it, I was initially apprehensive about finding an employee who would be good enough for it. This person has met my expectations, and then some.

As an entrepreneur – although I prefer to think of myself more as the head of a work-family – I try to maintain a pleasant demeanour all the time. This is one quality that I share with the person I am talking about. He is always available for a chat, has the patience to listen to every issue and grievance of others, and is game for every challenge – from sorting out a faulty office wifi router, to managing important corporate documents. For this guy, weekends do not make a difference – and evidently, doing a job, and doing it well, is all that matters. A trusty, intelligent, dedicated sidekick of mine!

Mr. Fakhruddin at MWC2017

Then there is the senior iOS developer, who has been a member of Team Teks for close to a decade now. When in the zone, he talks very little, often has messed up eating times, and has a wacko gameface on…you will never know whether he is trying to debug a piece of code or watching a particularly intense film! He takes every new project as an opportunity – an opportunity to expand his horizons, to learn more, and to prove his tech skills all over again. At times, he seems less like a software engineer, and more like a soldier in a fight.

So this person is all serious almost the entire time he is at office, but can you mark him as just another geek? You absolutely can’t – and the man’s diverse interests caught my attention from the very start. Put a DSLR camera in his hand, and the hidden photography-lover inside him comes to the fore – as he lovingly checks out its buttons and lens and what-not, with a tale or two about the basics of photography ready on his lips. He is also someone the junior developers look up to…someone who can be relied upon to help others, no matter how big a coding-related problem might seem to be. The guy is a big-time lover of travelling too, and of course, has a great many snaps (all taken by himself) from places all over the world. The most impressive thing about him is the way he constantly tries to learn and improve his skill-sets, while not letting go of things outside work that he is passionate about. If ever I need a person to bounce new ideas off, the guy is my best ally.

Every team has a funnyman, and at Teks, the head of the graphics department dons that role. Meet him outside office, and he is going to regale you with some of the most hilarious PJs and strange stories…often about himself. He is also an expert in leg-pulling, always with a prank or three up his sleeve. The man is not shy of having fun at his own expense – often bringing the house down with his ‘horrific’ (there’s no other word for it!) singing skills.

Team Teks celebrates Independence Day

With this person, it’s a classic ‘Dr-Jekyll-And-Mr.-Hyde’ story. When he first joined the Teks team, all that I saw was a bundle of energy, who was just very, very good at his job. All that I had to do was channelise this energy and give him a platform to showcase his skills. ‘Creativity’ is the middle name of this guy – as he continues to add life to graphics, images and animations, for apps and websites, promotional stuff, and practically everything else. Over the years, I have actually increased his responsibilities gradually – and he has taken to the new tasks like a duck to water. He also doubles up as a mentor to our in-house animators and game developers. A classic example of how you can bring your A-game to work, without ever having to sacrifice your inherent ‘joie-de-vivre’. Inspiring, indeed!

Every human being has their own distinct personality and traits – and I keep reiterating the importance of holding on to that, for everyone at Teksmobile. There is that one intern who joined as a part of the content team, who is now one of the senior app testers at the office. The nickname of one of our senior PHP developers is ‘slow-motion’ – but while at work, he is among the fastest to get things done, and his eye for quality is outstanding. A couple of Android developers have taken up the challenge of temporarily moving overseas (to Germany) and working from there. Then there is the curious case of our digital marketer – who gets totally zoned out when a keyboard is within reach – and often makes others wonder why he does not even get up to eat! The project managers who are always on the lookout to keep track of tasks, the HR lady who does much more than just schedule interviews and keep track of leaves, the animator who would rank among the biggest movie fanatics ever, the iOS developer who takes on the leadership mantle whenever someone is roasted at office (generally during the monthly birthday parties) – each person brings his/her unique qualities to the office, and I feel that it is this diversity that has helped Teksmobile assume its current stature as a market leader.

My company is a great example of the whole unit being greater than the sum of its parts. I feel proud…whenever I see my team of diverse individuals casting their differences aside to work together, bringing more success to the Teks brand in the process. 11 years ago, I had a dream of heading a tech startup – and these people have shared that dream in their own ways, to give shape to my ambition. I have given them the required guidance and advice whenever needed…but at the end of the day, they deserve every bit of recognition for their stellar work for the ‘Teks Family’. I should take this opportunity to thank our partners from Sweden, Australia and the US as well.

Mr. Fakhruddin with clients

At Teks, it is all about showcasing your talents, shaping your own luck, and becoming ‘bigger and better’ than ever before. Far from being robots, my employees are a team of disruptors – steadfastly refusing to follow preset norms, and constantly driving technology forward. And that’s just what I want them to be!”

 

Teksmobile is, at present, one of the top cross-platform mobile app and API development companies worldwide. The company is also working on projects based on cutting-edge technologies like VR/AR, internet of things (IoT), artificial intelligence (AI) and smart agriculture. To know more about the company, visit: http://teks.co.in/site/.