Saturday, February 20, 2021

Using Remote Sensing to Count Trees


 https://www.gislounge.com/using-remote-sensing-to-count-trees/


Tree count management is important. A systematic tree inventory assists in decision making. Customary methods for counting trees rely on labor-intensive cataloguing in the field or on the interpretation of large-scale aerial photographs. However, these methods are expensive, time consuming, and not practical for large, remote areas. Remote sensing technology is now the operational method for managing and monitoring green resources.

Methods of Extracting Remotely Sensed Data

  1. LiDAR
  2. Satellite Images
  3. UAV/Drone Images
  4. Terrestrial Photogrammetry

LiDAR

LiDAR data collection is increasingly used in forestry applications, and it is also employed in urban environments for green cover calculations, tree canopy mapping, and tree counting. The vast point clouds are usually converted into software-specific readable formats and then used for tree counting and urban forestry mapping.
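As a minimal sketch of how this can work in practice (the file name, window size, and height threshold are illustrative assumptions, not from the article), tree tops can be counted from a LiDAR-derived canopy height model in Python:

    import rasterio
    from scipy.ndimage import maximum_filter

    # Assumed input: a canopy height model (CHM) GeoTIFF already rasterized
    # from the classified LiDAR point cloud.
    with rasterio.open("chm.tif") as src:
        chm = src.read(1)

    # Candidate tree tops: pixels that are the local maximum in a 5 x 5 window
    # and taller than 2 m (both values are illustrative and site-dependent).
    local_max = maximum_filter(chm, size=5) == chm
    tree_tops = local_max & (chm > 2.0)
    print("Estimated tree count:", int(tree_tops.sum()))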

Satellite Images

Remotely sensed high-resolution or very-high-resolution satellite image data are crucial in this management, since they provide detailed information to administrators and planners for better decision making.

UAV/Drone Images

Hyperspectral remote sensing, which uses modern satellite sensors’ ability to capture data in multiple bands, in combination with a properly updated land information system, is considered a worthwhile technique for supporting fast decisions. Unmanned Aerial Vehicle (UAV) platforms are used for many remote sensing applications because they combine the advantages of traditional remote sensing techniques with the low cost of operating them. UAV drones can fly at varying altitudes depending on the objective of the mission and the type of end result required. This flexibility allows the procedures to be optimized according to the meteorological conditions over a given area and the user requirements.
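As an illustration of how multispectral bands from such platforms are used to map vegetation, the sketch below computes NDVI from red and near-infrared arrays; the band values are synthetic and the approach is generic, not tied to any specific sensor mentioned here.

    import numpy as np

    def ndvi(red, nir):
        """Normalized Difference Vegetation Index from red and near-infrared bands."""
        red = red.astype("float64")
        nir = nir.astype("float64")
        return (nir - red) / (nir + red + 1e-9)  # small constant avoids division by zero

    # Toy 2 x 2 band arrays standing in for drone or satellite imagery.
    red = np.array([[0.10, 0.20], [0.30, 0.10]])
    nir = np.array([[0.60, 0.50], [0.40, 0.70]])
    print(ndvi(red, nir))  # values near 1 indicate dense, healthy vegetation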

Terrestrial Photogrammetry

Although satellite and aerial images have been widely used to distinguish, delineate, and count individual trees in urban areas and forested land, until such techniques become widely accessible and knowledge of processing such data increases, the traditional methods still hold sway, which may be detrimental to the green cover we all wish to have.



About the Author

Anil Narendran Pillai – (Vice President – Geomatics @ SBL) Mr. Pillai heads the GSS (Geospatial Services) domain at SBL. He has worked in the digital mapping, remote sensing, and GIS industries for over 23 years. He has 23+ years of experience managing and coordinating GIS projects and 12 years of senior management experience. He has extensive experience in all aspects of aerial and satellite imaging technology and applications. He has utilized remotely sensed satellite and airborne imagery for a variety of environmental applications including site location analysis, forestry, telecommunications, and utility corridor mapping. He has a strong background in the management of GIS and photogrammetry imaging projects to support government and private industry needs. His passion lies in Needs Analysis and Documentation, Topographical Mapping (ArcGIS), Spatial Data Management, Integrity and Security, GIS Data Transformations and Projections from Multiple Sources, Image Processing Software User Testing and Documentation, Project Coordination and Tech Support, Inter-agency Communication and Support, 3D Data Generation and Management, Project Management, Digital Photogrammetry, Satellite Image Processing, and Pre-Sales Presentations.

See more about SBL Geospatial services http://www.sblcorp.com/geospatial-services


Using Near-Infrared Aerial Imagery to Map Oak Trees




A pilot project was developed that involved the identification of oak trees by the City’s arborist for a single aerial tile.  Staff at Engineering Systems then used those marked locations to create a polygon layer in AutoCAD of all oak trees present.  Engineering Systems then developed an application using Microsoft .NET that scanned the TIFF image (an 8,000 x 8,000 4-inch-pixel grid comprising 64 million pixels).  The pixels within the polygons were extracted and analyzed to prepare histograms representing the frequency of individual Red, Green, and Blue (RGB) values and to determine the peak values representing the spectral signature of the oak trees.

Three histograms representing frequencies of Red, Green, and Blue values.

The spectral signature was then used as the input parameters for a second application that scanned the TIFF image tile and extracted pixels that matched the signature.  The process went through several iterations, matched against field surveys, to verify that the correct species of trees were being selected.  The overall analysis found that the spectral signature was accurate in identifying more mature oak trees, but younger trees with smaller canopies were not being identified.  Engineering Systems is working on refining the process to be able to identify those younger trees.  Over 166,000 trees were located using this automated process.
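The City’s applications were written in Microsoft .NET, but the same two-step idea can be sketched in Python for illustration: derive peak RGB values from the pixels inside the arborist’s polygons, then flag pixels elsewhere that match the signature. The array shapes, the tolerance value, and the function names are assumptions, not the City’s actual implementation.

    import numpy as np

    def rgb_signature(sample_pixels):
        """Peak R, G, and B values (histogram modes) of pixels sampled inside the oak polygons."""
        # sample_pixels is assumed to be an N x 3 array of 8-bit RGB values.
        return [int(np.bincount(sample_pixels[:, band], minlength=256).argmax())
                for band in range(3)]

    def match_signature(image, signature, tolerance=10):
        """Boolean mask of pixels whose RGB values fall within +/- tolerance of the signature."""
        # image is assumed to be an H x W x 3 array read from the TIFF tile.
        diffs = np.abs(image.astype(int) - np.array(signature))
        return np.all(diffs <= tolerance, axis=-1)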

The resulting geographic layer also identifies the diameter of the oak tree canopy.  Since the oak tree locations are now georeferenced, the oak trees were spatially identified with the parcel number.  This now allows the staff within the various city departments to know when a property has an oak tree and is subject to the constraints of the Oak Tree Ordinance when plans are submitted by developers and property owners.





Thursday, August 6, 2020

PYTHON CODES

 
  •  A statement or expression is an instruction the computer will run or execute.
  • The value in the parentheses is called the argument.
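  For example, in the statement below the string "Hello World" is the argument passed to the print function:

    print("Hello World")  # the whole line is a statement; "Hello World" is its argument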




  • A semantic error is when your logic is wrong: the code runs, but it does not do what you intended.
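  For example, the code below runs without any error message, but if the intent was to add two numbers the logic is wrong:

    print("1" + "2")  # prints 12 (string concatenation), not 3 -- a semantic error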

    print("Hello\nWorld!")

    Hello
    World!



    Expressions describe a type of operation that computers perform.
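    For example, the following expressions perform arithmetic operations and produce values:

    print(43 + 60 + 16 + 41)  # addition: 160
    print(25 / 6)             # division: 4.166666666666667
    print(25 // 6)            # integer division: 4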








    1. We can bind a string to another variable.
    2. It is helpful to think of a string as a list or tuple.

    We can treat the string as a sequence and perform sequence operations.


    We can also input a stride value as follows. The 2 indicates that we select every second character.
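    For example (the sample string is illustrative):

    name = "Michael Jackson"
    print(name[0])     # indexing: M
    print(name[0:4])   # slicing: Mich
    print(name[::2])   # stride of 2 -- every second character: McalJcsn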

















    1. we set the variable A to the following value.
    2. We apply the method "upper" and assign the result to "B".
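    A small example of those two steps (the string value is illustrative):

    A = "Thriller is the sixth studio album"
    B = A.upper()
    print(B)  # THRILLER IS THE SIXTH STUDIO ALBUM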






    1. The method find finds sub-strings. The argument is the sub-string you would like to find. The output is the first index of the sequence.
    2. We can find the sub-string Jack. If the sub-string is not in the string, the output is negative one.
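    For example:

    name = "Michael Jackson"
    print(name.find("el"))    # 5 -- index where the sub-string starts
    print(name.find("Jack"))  # 8
    print(name.find("xyz"))   # -1 -- the sub-string is not in the string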

    MODULE 2


    Lists and tuples are called compound data types.
    Tuples are an ordered sequence.
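    A brief example of both compound types:

    tuple1 = ("disco", 10, 1.2)   # a tuple: ordered and immutable
    list1 = ["disco", 10, 1.2]    # a list: ordered and mutable
    print(tuple1[0], list1[1])    # indexing works the same way: disco 10
    list1[0] = "rock"             # lists can be changed in place; tuples cannot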






















    Saturday, August 1, 2020

    Sort data to analyze it

    Managing Your Data

    What is a database?

    Almost every data scientist will spend time working in a database, which is an organized collection of structured data in a computer system. (Remember, structured data is usually organized in a table format with rows and columns, like the following example.)

    Last four digits of social security number   Last name   Age
    6881                                         Marshall    23
    0121                                         Rodriguez   19
    5538                                         Cho         59
    2972                                         Parker      33
    3154                                         Sawyer      72
    Most databases today are organized as relational databases, which are collections of multiple data sets or tables that link together.
    While SQL is the underlying language that drives most work done in relational databases, there are many database management systems in which you can do that work. As you venture into this field, you’ll run into names like these:
    • MySQL
    • Microsoft Access
    • PostgreSQL
    • Oracle
    • IBM DB2
    • MongoDB
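    As a minimal sketch of working with a relational table like the one above, Python's built-in sqlite3 module can create and query it; the table and column names here are illustrative assumptions:

    import sqlite3

    conn = sqlite3.connect(":memory:")  # throwaway in-memory database for the example
    conn.execute("CREATE TABLE people (ssn_last4 TEXT, last_name TEXT, age INTEGER)")
    conn.executemany(
        "INSERT INTO people VALUES (?, ?, ?)",
        [("6881", "Marshall", 23), ("0121", "Rodriguez", 19), ("5538", "Cho", 59)],
    )
    for row in conn.execute("SELECT last_name, age FROM people WHERE age < 30"):
        print(row)  # ('Marshall', 23) then ('Rodriguez', 19)
    conn.close()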

    Choose the right tools to manage data

    Where do you begin? There are dozens of useful data science tools and platforms! Here’s a list of some popular and open source platforms that you can use to begin your own data science journey.

    R is a good place to start
    R is a programming language and free software environment often used for statistical analysis and data science. Many would-be data scientists start with this tool or with one of the popular R interfaces, and there are hundreds of useful packages in R that help with data visualization such as ggplot2.
    Python works for general purposes
    Python is a popular, general-purpose programming language that can also be used for data science. Pair it with a library like pandas and with a useful interface, and Python can help you create new insights and data visualizations.
    MATLAB helps crunch numbers
    MATLAB was built to focus on numerical computing. It is often used in higher education.
    Apache Spark supports big data and machine learning
    Apache Spark is an open-source, general-purpose framework that can be especially useful for extremely large data sets and the machine learning that uses them.

    Tuesday, July 14, 2020

    CODES


    Data Cleaning and Blending



    The NYT dataset doesn’t include information about county population, so I’m going to merge the two datasets into one using Python and pd.merge(). The same join could also be done with a SQL JOIN in a database or with a VLOOKUP in Excel.


    Before we do that, we first need to clean the data.

    Check the GitHub repo here.
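    A hedged sketch of that merge; the file names and column names below are assumptions for illustration, since the post itself doesn't show them:

    import pandas as pd

    # Assumed inputs: the NYT county-level case file and a county population file.
    cases = pd.read_csv("us-counties.csv")              # assumed columns: date, county, state, fips, cases, deaths
    population = pd.read_csv("county_population.csv")   # assumed columns: fips, population

    # Basic cleaning: drop rows without a FIPS code so the join keys line up.
    cases = cases.dropna(subset=["fips"])

    merged = pd.merge(cases, population, on="fips", how="left")
    merged["cases_per_100k"] = merged["cases"] / merged["population"] * 100_000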

    Monday, July 13, 2020

    Support Vector Machines

    Support Vector Machines, Clearly Explained!!!



    Plot an ROC Curve in Python

    How to Plot an ROC Curve in Python | Machine Learning in Python
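    A minimal sketch of plotting an ROC curve with scikit-learn and matplotlib, using synthetic data and an illustrative classifier:

    import matplotlib.pyplot as plt
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, roc_curve
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    scores = model.predict_proba(X_test)[:, 1]   # predicted probability of the positive class

    fpr, tpr, _ = roc_curve(y_test, scores)
    plt.plot(fpr, tpr, label=f"AUC = {roc_auc_score(y_test, scores):.2f}")
    plt.plot([0, 1], [0, 1], linestyle="--")     # the diagonal is a random classifier
    plt.xlabel("False positive rate")
    plt.ylabel("True positive rate")
    plt.legend()
    plt.show()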




    ROC and AUC, Clearly Explained!


    ROC and AUC in R



    machine learning and data science on cloud

    Run machine learning and data science in the cloud using high-performance GPUs at no cost




    https://colab.research.google.com/





    Google Colaboratory supports Python versions 2.7 and 3.6.
    I saw an example of how to use Swift in Colab a while ago.


    # Please try the newer version here:

    https://colab.research.google.com/drive/1BYnnbqeyZAlYnxR9IHC8tpW07EpDeyKR

    or a Kaggle R Jupyter notebook, which supports R and RStan by default:

    https://www.kaggle.com/thimac/rstan?scriptVersionId=20867095


    How to use R and Python in the same notebook on Google Colab
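    One common way to do this (an assumption about the workflow, since only the video is linked here) is the rpy2 extension, which adds an %%R cell magic to a Python notebook:

    # Cell 1 (Python): load the rpy2 notebook extension and build a data frame.
    %load_ext rpy2.ipython
    import pandas as pd
    df = pd.DataFrame({"x": [1, 2, 3], "y": [2, 4, 6]})

    # Cell 2 (R): the -i flag imports the Python object df into the R session.
    # %%R -i df
    # summary(lm(y ~ x, data = df))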





    How to Build Your First Data Science Web App in Python (Streamlit Tutorial Part 1)
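    A minimal sketch of a Streamlit app of the kind such a tutorial builds; the app content here is illustrative, not the tutorial's actual example:

    # app.py -- run with: streamlit run app.py
    import pandas as pd
    import streamlit as st

    st.title("My First Data Science Web App")
    uploaded = st.file_uploader("Upload a CSV file", type="csv")
    if uploaded is not None:
        df = pd.read_csv(uploaded)
        st.write("Preview of the data:")
        st.dataframe(df.head())
        st.line_chart(df.select_dtypes("number"))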





    Sunday, July 12, 2020

    Event Studies.py

    EventStudies.py




    I have a project going on and wanted to check whether you can help me with any of my challenges.

    Project: Comparative Analysis on stock returns around M&A announcements.
    I need an expert / consultant in Data Analysis and Econometrics with access to professional databases and knowledge possibly in Python, R, Stata or similar tools to conduct an event study with multiple event windows done on a large dataset.
    Vision:
    I have identified a list of ~17k transactions (the sample) that fulfil the selection criteria. I intend to conduct an event study in order to find abnormal returns for acquirer, target and both combined. The results shall then be presented.
    My challenges:
    - Reducing the sample? Yes or No?
    - Identifying the right indices or a basket of comparable stocks (by industry, geography, and liquidity) as a proxy for the market portfolio to regress against.
    - Sourcing the data for the large number of transactions (acquirer, target, market portfolio).
    - Cleaning the data and making sure that for each transaction there is an equal number of observations.
    - Estimating normal returns based on the respective market portfolio chosen for the specific event
    - Cumulative Abnormal returns for all Acquirers, Targets and both combined (Whole Sample)
    - Testing for significance
    - Dividing the data into two cohorts based on one simple selection criterion
    - Cumulative Abnormal returns for all Acquirers, Targets and both combined (Cohort 1 & Cohort 2)
    - Testing for significance
    Requirements:
    • The candidate must have proven knowledge and understanding of conducting event studies, data analysis and econometrics.
    • The candidate should have a good understanding of the academic literature surrounding event studies.
    • The candidate must have access to professional databases like Bloomberg, Datastream, CapitalIQ or comparable.
    Expectations:
    - A solution that requires as little manual intervention as possible and can be reused with a different data set.
    - Support in word and deed and act as a consultant.
    - Model documentation and validation including relevant tables and graphs and descriptive statistics
    - Source code, if any
    Specifications:
    Budget: 250 dollars, Delivery time: 7 days (Jul. 16 2020)
    Tool: Python
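    A hedged sketch of the core calculation behind such an event study, using the market model; the window lengths, return series, and synthetic data are assumptions for illustration only:

    import numpy as np

    def car(stock_ret, market_ret, est_slice, event_slice):
        """Cumulative abnormal return for one event, using the market model."""
        # 1. Estimate alpha and beta on the estimation window (e.g., day -250 to -30).
        beta, alpha = np.polyfit(market_ret[est_slice], stock_ret[est_slice], 1)
        # 2. Abnormal returns over the event window: actual minus model-expected returns.
        abnormal = stock_ret[event_slice] - (alpha + beta * market_ret[event_slice])
        # 3. Sum to get the cumulative abnormal return (CAR) for this event.
        return abnormal.sum()

    # Illustrative use with synthetic daily returns for a single transaction.
    rng = np.random.default_rng(0)
    market = rng.normal(0, 0.01, 260)
    stock = 0.0005 + 1.2 * market + rng.normal(0, 0.02, 260)
    print(car(stock, market, slice(0, 230), slice(230, 241)))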

    PYTHON SHELL