25 Best Freelance Tools to Enhance Your Business for Free

Freelancing can be a tough gig, but there is no better time than a new year to begin building (or rebuilding) a fantastic new business where you can do what you love. Being successful has a lot to do with your drive and passion, but depends largely on your efficiency, workflow and presentation. In this […]


The post 25 Best Freelance Tools to Enhance Your Business for Free appeared first on Web Designer Wall.





Easy CSS Animation Using @keyframes

CSS Transitions and transforms work beautifully for creating visual interactions based on single state changes. To have more control over what happens and when, you can use the CSS animation property to create easy CSS animation using @keyframes. This technique has a wide range of design application and can be used to build dazzling pre-loaders, […]


The post Easy CSS Animation Using @keyframes appeared first on Web Designer Wall.





How Important Is A Domain Name For Your Business?

Online representation has a crucial role in planning a business. Today, people turn to the internet whenever they need help, but especially when they want to find certain products or specific...





The Best Way To Improve Your Business Skills

Are you thinking about running a business after getting a degree? This article will help you find the best ways to make your business skills more efficient and useful.





How A Web Design Business Can Benefit From Using Accounting Applications

Accounting applications help web design businesses in many ways. As a web design service provider, you should use them to boost your business. Start by browsing some resources online that provide...





5 Tips That You Absolutely Must Know To Design A Unique Metal Business Card

Every day, thousands of business cards change hands, and they often get lost in mounds of other cards. Often, clients are unable to reach you just because they couldn't find your...



  • Design Round-up


Why Choosing The Best Web Hosting Is Crucial For Your Business

Not many business owners think about hosting when building a new website for their business. But failing to choose the right web hosting can have a great impact on your website and, of course, your...






TrailBuddy: Using AI to Create a Predictive Trail Conditions App

Viget is full of outdoor enthusiasts and, of course, technologists. For this year's Pointless Weekend, we brought these passions together to build TrailBuddy. This app aims to answer that eternal question: Is my favorite trail dry so I can go hike/run/ride?

While getting muddy might rekindle fond childhood memories for some, exposing your gear to the elements isn’t great – it’s bad for your equipment and can cause long-term, and potentially expensive, damage to the trail.

There are some trail apps out there, but we wanted one that would focus on current conditions. Currently, our favorite trail apps, like mtbproject.com, trailrunproject.com, and hikingproject.com (all owned by REI), rely on user-reported conditions. While this can be effective, the reports are frequently unreliable, as condition reports can become outdated in just a few days.

Our goal was to solve this problem by building an app that brought together location, soil type, and weather history data to create on-demand condition predictions for any trail in the US.

We built an initial version of TrailBuddy by tapping into several readily-available APIs, then running the combined data through a machine learning algorithm. (Oh, and also by bringing together a bunch of smart and motivated people and combining them with pizza and some of the magic that is our Pointless Weekends. We'll share the other Pointless Project, Scurry, with you soon.)

The quest for data.

We knew from the start this app would require data from a number of sources. As previously mentioned, we used REI’s APIs (i.e. https://www.hikingproject.com/data) as the source for basic trail information. We used each trail’s latitude and longitude coordinates, as well as its elevation, to query weather and soil type. We also found data points such as a trail’s total distance to be relevant to our app users and decided to include those on the front-end, too. Since we wanted to go beyond relying solely on user-reported metrics, which is how REI’s current MTB project works, we came up with a list of factors that could affect the trail on a given day.
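
For illustration, here is a minimal Python sketch of the kind of trail lookup described above. The article only links REI's data page, so the endpoint path, parameter names, and response fields below are assumptions based on that documentation, not Viget's actual code:

```python
import requests

API_KEY = "YOUR_KEY"  # hypothetical placeholder; a real key would come from the data page

def get_trails(lat, lon, max_distance_mi=10):
    # Query trails near a coordinate; the assumed response carries the fields
    # the app surfaces: name, length, difficulty, and location.
    resp = requests.get(
        "https://www.hikingproject.com/data/get-trails",
        params={"lat": lat, "lon": lon, "maxDistance": max_distance_mi, "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("trails", [])

for trail in get_trails(38.9072, -77.0369):  # example coordinate: Washington, DC
    print(trail["name"], trail["length"], trail["latitude"], trail["longitude"])
```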

First on that list was weather.

We not only considered the impacts of the current forecast, but we also looked at the previous days’ forecasts. For example, it’s safe to assume that if it’s currently raining, or had been raining over the last several days, the trail is likely to be muddy and unfavorable. We utilized the DarkSky API (https://darksky.net/dev) to get the weather forecast for that day, as well as the records for previous days. This included expected information, like temperature and precipitation chance. It also included some interesting data points that we realized might be factors, like precipitation intensity, cloud cover, and UV index.
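
Again as a sketch only: DarkSky's documented request pattern takes a latitude, longitude, and an optional Unix timestamp for past days ("Time Machine" requests). The exact response shape and field names below are assumptions drawn from the data points mentioned above:

```python
import datetime as dt
import requests

API_KEY = "YOUR_KEY"  # hypothetical placeholder

def weather_history(lat, lon, days_back=3):
    # Fetch today's daily summary plus the previous `days_back` days.
    out = []
    for d in range(days_back + 1):
        t = int((dt.datetime.now() - dt.timedelta(days=d)).timestamp())
        url = f"https://api.darksky.net/forecast/{API_KEY}/{lat},{lon},{t}"
        daily = requests.get(url, timeout=10).json()["daily"]["data"][0]
        out.append({
            "precipIntensity": daily.get("precipIntensity", 0),
            "precipProbability": daily.get("precipProbability", 0),
            "temperatureHigh": daily.get("temperatureHigh"),
            "cloudCover": daily.get("cloudCover"),
            "uvIndex": daily.get("uvIndex"),
        })
    return out  # index 0 = today, 1 = yesterday, ...
```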

But weather alone can’t predict how muddy or dry a trail will be. To determine that, we also wanted soil data to help predict how well a trail’s unique soil composition recovers after precipitation. Similar amounts of rain on trails of very different soil types can lead to vastly different trail conditions: a more clay-based soil holds water much longer, and is therefore much less favorable, than a loamy soil. Finding a reliable source for soil type and soil drainage proved incredibly difficult. After many hours, we finally found a source through the USDA that we could use. As a side note, the USDA keeps track of lots of data points on soil that are actually pretty interesting! We can’t say we’re soil experts, but we felt like we got pretty close.

We used Whimsical to build our initial wireframes.

Putting our design hats on.

From the very first pitch, TrailBuddy’s main differentiator from peer trail resources was its ability to surface real-time information reliably and simply. However complicated the technology needed to collect and interpret that information, the front-end design needed to be clean and unencumbered.

We thought about how users would naturally look for information when setting out to find a trail and what factors they’d think about when doing so. We posed questions like:

  • How easy or difficult a trail am I looking for?
  • How long is this trail?
  • What does the trail look like?
  • How far away is the trail in relation to my location?
  • What activity do I need a trail for?
  • Is this a trail I’d want to come back to in the future?

By putting ourselves in our users’ shoes, we quickly identified the key features TrailBuddy needed in order to be relevant and useful. First, we needed filtering, so users could filter by difficulty and distance to narrow results to fit their activity level. Next, we needed a way to look up trails by activity type; mountain biking, hiking, and running are all activities REI’s APIs already track, so those made sense as a starting point. And lastly, we needed a way for the app to find trails based on your location, or at the very least the ability to find a trail within a certain distance of your current location.

We used Figma to design, prototype, and gather feedback on TrailBuddy.

Using machine learning to predict trail conditions.

As stated earlier, none of us are actual soil or data scientists. So, in order to achieve the real-time conditions reporting TrailBuddy promised, we decided to leverage machine learning to make the predictions for us. Digging into machine learning was a first for everyone on this team. Luckily, there was an excellent tutorial that laid out the basics of building an ML model in Python. Given a CSV file with the inputs in the left columns and the desired output in the rightmost column, the script we generated could test multiple model strategies and report the effectiveness of each at predicting results, as shown below.
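
A sketch of what such an evaluator script might look like, in the spirit of the tutorial described above: load the CSV (inputs on the left, label on the right) and cross-validate several candidate models, including CART and SVM. The file name and column layout are invented for illustration:

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

data = pd.read_csv("trail_conditions.csv")
X, y = data.iloc[:, :-1], data.iloc[:, -1]  # features | trail status label

models = {
    "LR": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(),
    "CART": DecisionTreeClassifier(),  # classification and regression tree
    "NB": GaussianNB(),
    "SVM": SVC(),
}
for name, model in models.items():
    # 10-fold cross-validation, reporting mean accuracy per model strategy.
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} (+/- {scores.std():.3f})")
```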

We assembled all of the historical weather and soil data we could find for a given latitude/longitude coordinate, compiled a 1,000 × 100 CSV, ran it through the Python evaluator, and found that the CART and SVM models consistently outranked the others at predicting trail status. In other words, we had a working model to run our data through and (hopefully) get reliable predictions from. The next step was to figure out which data fields were actually critical in predicting trail status: the more we could refine our data set, the faster and smarter our predictive model could become.

We pulled in some Ruby code to take the original (and quite massive) CSV and output smaller versions to test with. Again, we’re no data scientists, but we were able to cull a good majority of the data and still get a model that performed at 95% accuracy.
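
The team did this culling step in Ruby; for consistency with the Python evaluator above, here is an equivalent hypothetical sketch that drops one column at a time and re-scores the model, to see which reduced feature sets still hold roughly 95% accuracy:

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

data = pd.read_csv("trail_conditions.csv")  # same invented file as above
label = data.columns[-1]

def accuracy_with(columns):
    # Re-run cross-validation using only the given feature columns.
    X, y = data[list(columns)], data[label]
    return cross_val_score(DecisionTreeClassifier(), X, y, cv=10).mean()

baseline = accuracy_with(data.columns[:-1])
for col in data.columns[:-1]:
    kept = [c for c in data.columns[:-1] if c != col]
    print(f"without {col}: {accuracy_with(kept):.3f} (baseline {baseline:.3f})")
```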

With our trained model in hand, we could serialize it into a model.pkl file (pkl stands for “pickle”, as in we’ve “pickled” the model), move that file into our Rails app along with a Python script to deserialize it, pass in a dynamic set of data, and generate real-time predictions. At the end of the day, our model has a propensity to predict fantastic trail conditions (about 99% of the time, in fact…). Just one of those optimistic machine learning models, we guess.
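
A sketch of that hand-off, with the file name and JSON contract invented for illustration; the pickle round-trip itself is the standard pattern the paragraph describes:

```python
# predict.py: deserialize the pickled model and emit predictions (hypothetical sketch).
import json
import pickle
import sys

with open("model.pkl", "rb") as f:  # the "pickled" trained model
    model = pickle.load(f)

features = json.loads(sys.stdin.read())  # e.g. [[0.2, 0.0, 71.0, 0.35, 6]]
print(json.dumps(model.predict(features).tolist()))
```

The Rails app can then shell out to it, e.g. `echo '[[0.2, 0.0, 71.0, 0.35, 6]]' | python predict.py`.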

Where we go from here.

After two days, it was clear our team still wanted to do more. As a first refinement, we’d love to work more with our data set and ML model. One surprise during the weekend was that we could remove all but two days’ worth of weather data, and all of the soil data we worked so hard to dig up, and still hit 95% accuracy. Which … doesn’t make a ton of sense. Perhaps the data we chose just isn’t a great empirical predictor of trail status. These questions are too big to solve in a single weekend, but we'd love to spend more time digging into them in a future iteration.



  • News & Culture


Here comes Traversty traversing the DOM

The purpose of the Traversty DOM utility is to let you traverse the DOM and manage collections of DOM elements. Its proponents note that the core Traversty traversal methods are inspired by Prototype’s DOM Traversal toolkit, but operate in a multi-element environment that is more like jQuery and less like Prototype’s single-element implementation.








Humanity ‘Sleepwalking Towards the Edge of a Cliff’: 60% of Earth’s Wildlife Wiped Out Since 1970

By Julia Conley Common Dreams “Nature is not a ‘nice to have’—it is our life-support system.” Scientists from around the world issued a stark warning to humanity Tuesday in a semi-annual report on the Earth’s declining biodiversity, which shows that … Continue reading





Human Activity Increasing Rate of Record-Breaking Hot Years

American Geophysical Union (AGU) Press Release A new study finds human-caused global warming is significantly increasing the rate at which hot temperature records are being broken around the world. Global annual temperature records show there were 17 record hot years … Continue reading









Using Funds from Disability Compensation and the GI Bill for Going Back to School

Receiving service-related disability compensation does not interfere with the funds veterans receive from the GI Bill, explains Adam.





Using Communities to Further the True Meaning of Resiliency

Service members, veterans, and their caregivers are incredibly resilient, says Adam, but learning to connect with whatever community you are in will only strengthen that resiliency.





Inform user about automatic comment closing time

To prevent spammers from flooding old articles with useless comments, you can set WordPress to close comments after a certain […]





Can Business Save the World From Climate Change?

By Bianca Nogrady Ensia A growing number of initiatives are giving corporations the resources to help achieve global climate goals regardless of government support “We are still in.” On June 5, 2017, with these four words a group of U.S. … Continue reading





Self Reliance + Personal Uprising with John Jantsch

John Jantsch is a veteran marketer. He’s written several bestselling books, including Duct Tape Marketing and The Referral Engine, and he’s out with a new book called The Self-Reliant Entrepreneur: 366 Daily Meditations to Feed Your Soul and Grow Your Business. As you might know, I’m a bit of a fan of daily habits, so of course John gives us a little preview into some of the daily explorations of thoughts and writings from notable American authors. Of course, that’s not all. We also get into:

  • Following your own path and listening to your intuition.
  • What we can learn from the rabble-rousers of our history and those who embraced counterculture to follow their own beliefs.
  • The role that self-awareness plays in pursuing our dreams.
  • And much more. Enjoy!

FOLLOW JOHN: facebook | twitter | website

This podcast is brought to you by CreativeLive. CreativeLive is the world’s largest hub for online creative education in photo/video, art/design, music/audio, craft/maker, money/life, and the ability to make a living in any of those disciplines. The classes are high quality and highly curated, taught by the world’s top experts: Pulitzer, Oscar, and Grammy Award winners, New […]

The post Self Reliance + Personal Uprising with John Jantsch appeared first on Chase Jarvis Photography.






Implementing Dark Mode In React Apps Using styled-components

One of the most commonly requested software features is dark mode (or night mode, as others call it). We see dark mode in the apps that we use every day. From mobile to web apps, dark mode has become vital for companies that want to take care of their users’ eyes. Dark mode is a supplemental feature that displays mostly dark surfaces in the UI. Most major companies (such as YouTube, Twitter, and Netflix) have adopted dark mode in their mobile and web apps.





How To Build A Vue Survey App Using Firebase Authentication And Database

In this tutorial, you’ll be building a survey app, where we’ll learn to validate our users’ form data, implement authentication in Vue, and receive survey data using Vue and Firebase (a BaaS platform). As we build this app, we’ll learn how to handle form validation for different kinds of data, including reaching out to the backend to check whether an email is already taken, even before the user submits the form during sign-up.





Nonlinear singular problems with indefinite potential term. (arXiv:2005.01789v3 [math.AP] UPDATED)

We consider a nonlinear Dirichlet problem driven by a nonhomogeneous differential operator plus an indefinite potential. In the reaction we have the competing effects of a singular term and of concave and convex nonlinearities. In this paper the concave term is parametric. We prove a bifurcation-type theorem describing the changes in the set of positive solutions as the positive parameter $\lambda$ varies. This work continues our research published in arXiv:2004.12583, where $\xi \equiv 0$ and in the reaction the parametric term is the singular one.





Solving an inverse problem for the Sturm-Liouville operator with a singular potential by Yurko's method. (arXiv:2004.14721v2 [math.SP] UPDATED)

An inverse spectral problem for the Sturm-Liouville operator with a singular potential from the class $W_2^{-1}$ is solved by the method of spectral mappings. We prove the uniqueness theorem, develop a constructive algorithm for the solution, and obtain necessary and sufficient conditions of solvability for the inverse problem in both the self-adjoint and the non-self-adjoint cases.





Local mollification of Riemannian metrics using Ricci flow, and Ricci limit spaces. (arXiv:1706.09490v2 [math.DG] UPDATED)

We use Ricci flow to obtain a local bi-Holder correspondence between Ricci limit spaces in three dimensions and smooth manifolds. This is more than a complete resolution of the three-dimensional case of the conjecture of Anderson-Cheeger-Colding-Tian, describing how Ricci limit spaces in three dimensions must be homeomorphic to manifolds, and we obtain this in the most general, locally non-collapsed case. The proofs build on results and ideas from recent papers of Hochard and the current authors.





Positive Geometries and Differential Forms with Non-Logarithmic Singularities I. (arXiv:2005.03612v1 [hep-th])

Positive geometries encode the physics of scattering amplitudes in flat space-time and the wavefunction of the universe in cosmology for a large class of models. Their unique canonical forms, providing such quantum mechanical observables, are characterised by having only logarithmic singularities along all the boundaries of the positive geometry. However, physical observables have logarithmic singularities just for a subset of theories. Thus, it becomes crucial to understand whether a similar paradigm can underlie their structure in more general cases. In this paper we start a systematic investigation of a geometric-combinatorial characterisation of differential forms with non-logarithmic singularities, focusing on projective polytopes and related meromorphic forms with multiple poles. We introduce the notions of covariant forms and covariant pairings. Covariant forms have poles only along the boundaries of the given polytope; moreover, their leading Laurent coefficients along any of the boundaries are still covariant forms on the specific boundary. Meromorphic forms in covariant pairing with a polytope are instead associated to a specific (signed) triangulation, in which poles on spurious boundaries do not cancel completely, but their order is lowered. These meromorphic forms can be fully characterised if the polytope they are associated to is viewed as the restriction of a higher dimensional one onto a hyperplane. The canonical form of the latter can be mapped into a covariant form or a form in covariant pairing via a covariant restriction. We show how the geometry of the higher dimensional polytope determines the structure of these differential forms. Finally, we discuss how these notions are related to Jeffrey-Kirwan residues and cosmological polytopes.





The Fourier Transform Approach to Inversion of lambda-Cosine and Funk Transforms on the Unit Sphere. (arXiv:2005.03607v1 [math.FA])

We use classical Fourier analysis to introduce analytic families of weighted differential operators on the unit sphere. These operators are polynomial functions of the usual Beltrami-Laplace operator. New inversion formulas are obtained for totally geodesic Funk transforms on the sphere and the corresponding lambda-cosine transforms.





Toric Sasaki-Einstein metrics with conical singularities. (arXiv:2005.03502v1 [math.DG])

We show that any toric Kähler cone with smooth compact cross-section admits a family of Calabi-Yau cone metrics with conical singularities along its toric divisors. The family is parametrized by the Reeb cone and the angles are given explicitly in terms of the Reeb vector field. The result is optimal, in the sense that any toric Calabi-Yau cone metric with conical singularities along the toric divisor (and smooth elsewhere) belongs to this family. We also provide examples and interpret our results in terms of Sasaki-Einstein metrics.





Removable singularities for Lipschitz caloric functions in time varying domains. (arXiv:2005.03397v1 [math.CA])

In this paper we study removable singularities for regular $(1,1/2)$-Lipschitz solutions of the heat equation in time varying domains. We introduce an associated Lipschitz caloric capacity and we study its metric and geometric properties and the connection with the $L^2$ boundedness of the singular integral whose kernel is given by the gradient of the fundamental solution of the heat equation.





Semiglobal non-oscillatory big bang singular spacetimes for the Einstein-scalar field system. (arXiv:2005.03395v1 [math-ph])

We construct semiglobal singular spacetimes for the Einstein equations coupled to a massless scalar field. Consistent with the heuristic analysis of Belinskii, Khalatnikov, Lifshitz or BKL for this system, there are no oscillations due to the scalar field. (This is much simpler than the oscillatory BKL heuristics for the Einstein vacuum equations.) Prior results are due to Andersson and Rendall in the real analytic case, and Rodnianski and Speck in the smooth near-spatially-flat-FLRW case. Similar to Andersson and Rendall we give asymptotic data at the singularity, which we refer to as final data, but our construction is not limited to real analytic solutions. This paper is a test application of tools (a graded Lie algebra formulation of the Einstein equations and a filtration) intended for the more subtle vacuum case. We use homological algebra tools to construct a formal series solution, then symmetric hyperbolic energy estimates to construct a true solution well-approximated by truncations of the formal one. We conjecture that the image of the map from final data to initial data is an open set of anisotropic initial data.





Converging outer approximations to global attractors using semidefinite programming. (arXiv:2005.03346v1 [math.OC])

This paper develops a method for obtaining guaranteed outer approximations for global attractors of continuous and discrete time nonlinear dynamical systems. The method is based on a hierarchy of semidefinite programming problems of increasing size with guaranteed convergence to the global attractor. The approach taken follows an established line of reasoning, where we first characterize the global attractor via an infinite dimensional linear programming problem (LP) in the space of Borel measures. The dual to this LP is in the space of continuous functions and its feasible solutions provide guaranteed outer approximations to the global attractor. For systems with polynomial dynamics, a hierarchy of finite-dimensional sum-of-squares tightenings of the dual LP provides a sequence of outer approximations to the global attractor with guaranteed convergence in the sense of volume discrepancy tending to zero. The method is very simple to use and based purely on convex optimization. Numerical examples with the code available online demonstrate the method.





Homotopy invariance of the space of metrics with positive scalar curvature on manifolds with singularities. (arXiv:2005.03073v1 [math.AT])

In this paper we study manifolds $M_{\Sigma}$ with fibered singularities, more specifically, a relevant space $\mathrm{Riem}^{\mathrm{psc}}(X_{\Sigma})$ of Riemannian metrics with positive scalar curvature. Our main goal is to prove that the space $\mathrm{Riem}^{\mathrm{psc}}(X_{\Sigma})$ is homotopy invariant under certain surgeries on $M_{\Sigma}$.





Modeling nanoconfinement effects using active learning. (arXiv:2005.02587v2 [physics.app-ph] UPDATED)

Predicting the spatial configuration of gas molecules in nanopores of shale formations is crucial for fluid flow forecasting and hydrocarbon reserves estimation. The key challenge in these tight formations is that the majority of the pore sizes are less than 50 nm. At this scale, the fluid properties are affected by nanoconfinement effects due to the increased fluid-solid interactions. For instance, gas adsorption to the pore walls could account for up to 85% of the total hydrocarbon volume in a tight reservoir. Although there are analytical solutions that describe this phenomenon for simple geometries, they are not suitable for describing realistic pores, where surface roughness and geometric anisotropy play important roles. To describe these, molecular dynamics (MD) simulations are used since they consider fluid-solid and fluid-fluid interactions at the molecular level. However, MD simulations are computationally expensive, and are not able to simulate scales larger than a few connected nanopores. We present a method for building and training physics-based deep learning surrogate models to carry out fast and accurate predictions of molecular configurations of gas inside nanopores. Since training deep learning models requires extensive databases that are computationally expensive to create, we employ active learning (AL). AL reduces the overhead of creating comprehensive sets of high-fidelity data by determining where the model uncertainty is greatest, and running simulations on the fly to minimize it. The proposed workflow enables nanoconfinement effects to be rigorously considered at the mesoscale where complex connected sets of nanopores control key applications such as hydrocarbon recovery and CO2 sequestration.





Temporal Event Segmentation using Attention-based Perceptual Prediction Model for Continual Learning. (arXiv:2005.02463v2 [cs.CV] UPDATED)

Temporal event segmentation of a long video into coherent events requires a high-level understanding of activities' temporal features. The event segmentation problem has been tackled by researchers in an offline training scheme, either by providing full, or weak, supervision through manually annotated labels or by self-supervised epoch-based training. In this work, we present a continual learning perceptual prediction framework (influenced by cognitive psychology) capable of temporal event segmentation through understanding of the underlying representation of objects within individual frames. Our framework also outputs attention maps which effectively localize and track event-causing objects in each frame. The model is tested on a wildlife monitoring dataset in a continual training manner, resulting in an $80\%$ recall rate at a $20\%$ false positive rate for frame-level segmentation. Activity-level testing yielded an $80\%$ activity recall rate with one false activity detection every 50 minutes.





Prediction of Event Related Potential Speller Performance Using Resting-State EEG. (arXiv:2005.01325v3 [cs.HC] UPDATED)

Event-related potential (ERP) spellers can be utilized for device control and communication by locked-in or severely injured patients. However, problems such as inter-subject performance instability and ERP-illiteracy are still unresolved. Therefore, it is necessary to predict classification performance before performing an ERP speller in order to use it efficiently. In this study, we investigated correlations with ERP speller performance using the resting state recorded before an ERP speller session. Specifically, we used spectral power and functional connectivity across four brain regions and five frequency bands. As a result, the delta power in the frontal region and functional connectivity in the delta, alpha, and gamma bands are significantly correlated with ERP speller performance. We also predicted ERP speller performance using EEG features from the resting state. These findings may contribute to investigating ERP-illiteracy and to considering appropriate alternatives for each user.





SPECTER: Document-level Representation Learning using Citation-informed Transformers. (arXiv:2004.07180v3 [cs.CL] UPDATED)

Representation learning is a critical ingredient for natural language processing systems. Recent Transformer language models like BERT learn powerful textual representations, but these models are targeted towards token- and sentence-level training objectives and do not leverage information on inter-document relatedness, which limits their document-level representation power. For applications on scientific documents, such as classification and recommendation, the embeddings power strong performance on end tasks. We propose SPECTER, a new method to generate document-level embedding of scientific documents based on pretraining a Transformer language model on a powerful signal of document-level relatedness: the citation graph. Unlike existing pretrained language models, SPECTER can be easily applied to downstream applications without task-specific fine-tuning. Additionally, to encourage further research on document-level models, we introduce SciDocs, a new evaluation benchmark consisting of seven document-level tasks ranging from citation prediction, to document classification and recommendation. We show that SPECTER outperforms a variety of competitive baselines on the benchmark.





Transfer Learning for EEG-Based Brain-Computer Interfaces: A Review of Progress Made Since 2016. (arXiv:2004.06286v3 [cs.HC] UPDATED)

A brain-computer interface (BCI) enables a user to communicate with a computer directly using brain signals. Electroencephalograms (EEGs) used in BCIs are weak, easily contaminated by interference and noise, non-stationary for the same subject, and varying across different subjects and sessions. Therefore, it is difficult to build a generic pattern recognition model in an EEG-based BCI system that is optimal for different subjects, during different sessions, for different devices and tasks. Usually, a calibration session is needed to collect some training data for a new subject, which is time consuming and user unfriendly. Transfer learning (TL), which utilizes data or knowledge from similar or relevant subjects/sessions/devices/tasks to facilitate learning for a new subject/session/device/task, is frequently used to reduce the amount of calibration effort. This paper reviews journal publications on TL approaches in EEG-based BCIs in the last few years, i.e., since 2016. Six paradigms and applications -- motor imagery, event-related potentials, steady-state visual evoked potentials, affective BCIs, regression problems, and adversarial attacks -- are considered. For each paradigm/application, we group the TL approaches into cross-subject/session, cross-device, and cross-task settings and review them separately. Observations and conclusions are made at the end of the paper, which may point to future research directions.





Improved RawNet with Feature Map Scaling for Text-independent Speaker Verification using Raw Waveforms. (arXiv:2004.00526v2 [eess.AS] UPDATED)

Recent advances in deep learning have facilitated the design of speaker verification systems that directly input raw waveforms. For example, RawNet extracts speaker embeddings from raw waveforms, which simplifies the process pipeline and demonstrates competitive performance. In this study, we improve RawNet by scaling feature maps using various methods. The proposed mechanism utilizes a scale vector that adopts a sigmoid non-linear function. It refers to a vector with dimensionality equal to the number of filters in a given feature map. Using a scale vector, we propose to scale the feature map multiplicatively, additively, or both. In addition, we investigate replacing the first convolution layer with the sinc-convolution layer of SincNet. Experiments performed on the VoxCeleb1 evaluation dataset demonstrate the effectiveness of the proposed methods, and the best performing system reduces the equal error rate by half compared to the original RawNet. Expanded evaluation results obtained using the VoxCeleb1-E and VoxCeleb-H protocols marginally outperform existing state-of-the-art systems.
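
As an illustration of the scaling mechanism the abstract describes (an assumption-level NumPy sketch, not the authors' code): derive a per-filter scale vector through a sigmoid, then apply it multiplicatively, additively, or both:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 40))  # feature map: 16 filters x 40 time steps
W = rng.standard_normal((16, 16))  # weights of the (learned) scaling layer

# Scale vector: one sigmoid-squashed value per filter, computed from the
# time-averaged feature map (dimensionality equals the number of filters).
s = 1.0 / (1.0 + np.exp(-(W @ x.mean(axis=1))))

scaled_mul = x * s[:, None]                 # multiplicative scaling
scaled_add = x + s[:, None]                 # additive scaling
scaled_both = x * s[:, None] + s[:, None]   # combined
```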





Hierarchical Neural Architecture Search for Single Image Super-Resolution. (arXiv:2003.04619v2 [cs.CV] UPDATED)

Deep neural networks have exhibited promising performance in image super-resolution (SR). Most SR models follow a hierarchical architecture that contains both the cell-level design of computational blocks and the network-level design of the positions of upsampling blocks. However, designing SR models heavily relies on human expertise and is very labor-intensive. More critically, these SR models often contain a huge number of parameters and may not meet the requirements of computation resources in real-world applications. To address the above issues, we propose a Hierarchical Neural Architecture Search (HNAS) method to automatically design promising architectures with different requirements of computation cost. To this end, we design a hierarchical SR search space and propose a hierarchical controller for architecture search. Such a hierarchical controller is able to simultaneously find promising cell-level blocks and network-level positions of upsampling layers. Moreover, to design compact architectures with promising performance, we build a joint reward by considering both the performance and computation cost to guide the search process. Extensive experiments on five benchmark datasets demonstrate the superiority of our method over existing methods.





SCAttNet: Semantic Segmentation Network with Spatial and Channel Attention Mechanism for High-Resolution Remote Sensing Images. (arXiv:1912.09121v2 [cs.CV] UPDATED)

High-resolution remote sensing images (HRRSIs) contain substantial ground object information, such as texture, shape, and spatial location. Semantic segmentation, which is an important task for element extraction, has been widely used in processing mass HRRSIs. However, HRRSIs often exhibit large intraclass variance and small interclass variance due to the diversity and complexity of ground objects, thereby bringing great challenges to a semantic segmentation task. In this paper, we propose a new end-to-end semantic segmentation network, which integrates lightweight spatial and channel attention modules that can refine features adaptively. We compare our method with several classic methods on the ISPRS Vaihingen and Potsdam datasets. Experimental results show that our method can achieve better semantic segmentation results. The source codes are available at https://github.com/lehaifeng/SCAttNet.





Biologic and Prognostic Feature Scores from Whole-Slide Histology Images Using Deep Learning. (arXiv:1910.09100v4 [q-bio.QM] UPDATED)

Histopathology is a reflection of the molecular changes and provides prognostic phenotypes representing the disease progression. In this study, we introduced feature scores generated from hematoxylin and eosin histology images based on deep learning (DL) models developed for prostate pathology. We demonstrated that these feature scores were significantly prognostic for time to event endpoints (biochemical recurrence and cancer-specific survival) and had simultaneously molecular biologic associations to relevant genomic alterations and molecular subtypes using already trained DL models that were not previously exposed to the datasets of the current study. Further, we discussed the potential of such feature scores to improve the current tumor grading system and the challenges that are associated with tumor heterogeneity and the development of prognostic models from histology images. Our findings uncover the potential of feature scores from histology images as digital biomarkers in precision medicine and as an expanding utility for digital pathology.





Imitation Learning for Human-robot Cooperation Using Bilateral Control. (arXiv:1909.13018v2 [cs.RO] UPDATED)

Robots are required to operate autonomously in response to changing situations. Previously, imitation learning using 4ch-bilateral control was demonstrated to be suitable for imitation of object manipulation. However, cooperative work between humans and robots had not yet been verified in those studies. In this study, the task was expanded to cooperative work between a human and a robot. 4ch-bilateral control was used to collect training data for training robot motion. We focused on serving salad as a task in the home. The task was executed with a spoon and a fork fixed to the robots. Adjustment of force is indispensable when manipulating indefinitely shaped objects such as salad. The results confirmed the effectiveness of the proposed method, as demonstrated by the success of the task.





Box Covers and Domain Orderings for Beyond Worst-Case Join Processing. (arXiv:1909.12102v2 [cs.DB] UPDATED)

Recent beyond worst-case optimal join algorithms Minesweeper and its generalization Tetris have brought the theory of indexing and join processing together by developing a geometric framework for joins. These algorithms take as input an index $\mathcal{B}$, referred to as a box cover, that stores output gaps that can be inferred from traditional indexes, such as B+ trees or tries, on the input relations. The performances of these algorithms highly depend on the certificate of $\mathcal{B}$, which is the smallest subset of gaps in $\mathcal{B}$ whose union covers all of the gaps in the output space of a query $Q$. We study how to generate box covers that contain small size certificates to guarantee efficient runtimes for these algorithms. First, given a query $Q$ over a set of relations of size $N$ and a fixed set of domain orderings for the attributes, we give a $\tilde{O}(N)$-time algorithm called GAMB which generates a box cover for $Q$ that is guaranteed to contain the smallest size certificate across any box cover for $Q$. Second, we show that finding a domain ordering to minimize the box cover size and certificate is NP-hard through a reduction from the 2 consecutive block minimization problem on boolean matrices. Our third contribution is a $\tilde{O}(N)$-time approximation algorithm called ADORA to compute domain orderings, under which one can compute a box cover of size $\tilde{O}(K^r)$, where $K$ is the minimum box cover for $Q$ under any domain ordering and $r$ is the maximum arity of any relation. This guarantees certificates of size $\tilde{O}(K^r)$. We combine ADORA and GAMB with Tetris to form a new algorithm we call TetrisReordered, which provides several new beyond worst-case bounds. On infinite families of queries, TetrisReordered's runtimes are unboundedly better than the bounds stated in prior work.





Numerical study on the effect of geometric approximation error in the numerical solution of PDEs using a high-order curvilinear mesh. (arXiv:1908.09917v2 [math.NA] UPDATED)

When time-dependent partial differential equations (PDEs) are solved numerically in a domain with a curved boundary or on a curved surface, mesh error and geometric approximation error, caused by the inaccurate location of vertices and other interior grid points respectively, can be the main sources of inaccuracy and instability in the numerical solutions of PDEs. The role of these geometric errors in deteriorating stability, and particularly conservation properties, is largely unknown, which seems to necessitate very fine meshes, especially to remove geometric approximation error. This paper aims to investigate the effect of geometric approximation error by using a high-order mesh with negligible geometric approximation error, even for high polynomial orders p. To achieve this goal, the high-order mesh generator from CAD geometry called NekMesh is adapted for surface mesh generation, in comparison to traditional meshes with non-negligible geometric approximation error. Two types of numerical tests are considered. Firstly, the accuracy of differential operators is compared for various p on a curved element of the sphere. Secondly, by applying the method of moving frames, four different time-dependent PDEs on the sphere are numerically solved to investigate the impact of geometric approximation error on the accuracy and conservation properties of high-order numerical schemes for PDEs on the sphere.





Single use register automata for data words. (arXiv:1907.10504v2 [cs.FL] UPDATED)

Our starting point are register automata for data words, in the style of Kaminski and Francez. We study the effects of the single-use restriction, which says that a register is emptied immediately after being used. We show that under the single-use restriction, the theory of automata for data words becomes much more robust. The main results are: (a) five different machine models are equivalent as language acceptors, including one-way and two-way single-use register automata; (b) one can recover some of the algebraic theory of languages over finite alphabets, including a version of the Krohn-Rhodes Theorem; (c) there is also a robust theory of transducers, with four equivalent models, including two-way single use transducers and a variant of streaming string transducers for data words. These results are in contrast with automata for data words without the single-use restriction, where essentially all models are pairwise non-equivalent.





Identifying Compromised Accounts on Social Media Using Statistical Text Analysis. (arXiv:1804.07247v3 [cs.SI] UPDATED)

Compromised accounts on social networks are regular user accounts that have been taken over by an entity with malicious intent. Since the adversary exploits the already established trust of a compromised account, it is crucial to detect these accounts to limit the damage they can cause. We propose a novel general framework for discovering compromised accounts by semantic analysis of text messages coming out from an account. Our framework is built on the observation that normal users will use language that is measurably different from the language that an adversary would use when the account is compromised. We use our framework to develop specific algorithms that use the difference of language models of users and adversaries as features in a supervised learning setup. Evaluation results show that the proposed framework is effective for discovering compromised accounts on social networks and a KL-divergence-based language model feature works best.
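
As an illustration of the kind of feature the abstract describes (assumptions for illustration, not the paper's code): build smoothed unigram language models for an account's historical and recent messages, then use the KL divergence between them as a feature in the supervised-learning setup:

```python
import math
from collections import Counter

def unigram_lm(tokens, vocab, alpha=1.0):
    # Laplace-smoothed unigram language model over a shared vocabulary.
    counts = Counter(tokens)
    total = len(tokens) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def kl_divergence(p, q):
    # KL(p || q); smoothing guarantees every q[w] is positive.
    return sum(p[w] * math.log(p[w] / q[w]) for w in p)

history = "meeting tomorrow about the project report".split()
recent = "click this link to win free money now".split()
vocab = set(history) | set(recent)
p, q = unigram_lm(recent, vocab), unigram_lm(history, vocab)
print(f"KL(recent || history) = {kl_divergence(p, q):.3f}")  # higher = more anomalous
```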





Using hierarchical matrices in the solution of the time-fractional heat equation by multigrid waveform relaxation. (arXiv:1706.07632v3 [math.NA] UPDATED)

This work deals with the efficient numerical solution of the time-fractional heat equation discretized on non-uniform temporal meshes. Non-uniform grids are essential to capture the singularities of "typical" solutions of time-fractional problems. We propose an efficient space-time multigrid method based on the waveform relaxation technique, which accounts for the nonlocal character of the fractional differential operator. To maintain an optimal complexity, which can be obtained for the case of uniform grids, we approximate the coefficient matrix corresponding to the temporal discretization by its hierarchical matrix (${\cal H}$-matrix) representation. In particular, the proposed method has a computational cost of ${\cal O}(k N M \log(M))$, where $M$ is the number of time steps, $N$ is the number of spatial grid points, and $k$ is a parameter which controls the accuracy of the ${\cal H}$-matrix approximation. The efficiency and the good convergence of the algorithm, which can be theoretically justified by a semi-algebraic mode analysis, are demonstrated through numerical experiments in both one- and two-dimensional spaces.





Seismic Shot Gather Noise Localization Using a Multi-Scale Feature-Fusion-Based Neural Network. (arXiv:2005.03626v1 [cs.CV])

Deep learning-based models, such as convolutional neural networks, have advanced various segments of computer vision. However, this technology is rarely applied to the seismic shot-gather noise localization problem. This letter presents an investigation into the effectiveness of a multi-scale feature-fusion-based network for seismic shot-gather noise localization. Herein, we describe the following: (1) the construction of a real-world dataset for seismic noise localization based on 6,500 seismograms; (2) a multi-scale feature-fusion-based detector that uses MobileNet combined with the Feature Pyramid Net as the backbone; and (3) the Single Shot multi-box detector for box classification/regression. Additionally, we propose the use of the Focal Loss function, which improves the detector's prediction accuracy. The proposed detector achieves an AP@0.5 of 78.67% in our empirical evaluation.