
NH-HAZE: An Image Dehazing Benchmark with Non-Homogeneous Hazy and Haze-Free Images. (arXiv:2005.03560v1 [cs.CV])

Image dehazing is an ill-posed problem that has been extensively studied in recent years. Objective performance evaluation of dehazing methods is one of the major obstacles, due to the lack of a reference dataset. While synthetic datasets have shown important limitations, the few realistic datasets introduced recently assume homogeneous haze over the entire scene. Since haze in many real cases is not uniformly distributed, we introduce NH-HAZE, a non-homogeneous realistic dataset with pairs of real hazy and corresponding haze-free images. This is the first non-homogeneous image dehazing dataset and contains 55 outdoor scenes. The non-homogeneous haze was introduced into the scenes using a professional haze generator that imitates the real conditions of hazy scenes. Additionally, this work presents an objective assessment of several state-of-the-art single image dehazing methods evaluated on the NH-HAZE dataset.





Collaborative Deanonymization. (arXiv:2005.03535v1 [cs.CR])

We propose protocols to resolve the tension between anonymity and accountability in a peer-to-peer manner. Law enforcement can adopt this approach to solve crimes involving cryptocurrency and anonymization techniques. We illustrate how the protocols could apply to Monero rings and CoinJoin transactions in Bitcoin.





An asynchronous distributed and scalable generalized Nash equilibrium seeking algorithm for strongly monotone games. (arXiv:2005.03507v1 [cs.GT])

In this paper, we present three distributed algorithms to solve a class of generalized Nash equilibrium (GNE) seeking problems in strongly monotone games. The first (SD-GENO) is based on synchronous updates of the agents, while the second and third (AD-GEED and AD-GENO) are asynchronous solutions that are robust to communication delays. AD-GENO can be seen as a refinement of AD-GEED, since it only requires auxiliary variables at the nodes, enhancing the scalability of the algorithm. Our main contribution is to prove convergence to a variational GNE of the game via an operator-theoretic approach. Finally, we apply the algorithms to network Cournot games and show how different activation sequences and delays affect convergence. We also compare the proposed algorithms to the only other algorithm in the literature (ADAGNES), and observe that AD-GENO outperforms the alternative.





Anonymized GCN: A Novel Robust Graph Embedding Method via Hiding Node Position in Noise. (arXiv:2005.03482v1 [cs.LG])

Graph convolutional networks (GCNs) have achieved state-of-the-art performance in node prediction tasks on graph-structured data. However, as graph attack methods have diversified, research on the robustness of GCNs remains scarce. In this paper, we design a robust GCN method for node prediction tasks. A graph contains two types of information, node information and connection information, and attackers usually modify the connection information to interfere with node prediction results. We therefore propose a method that hides the connection information inside a generator, named Anonymized GCN (AN-GCN). By hiding the connection information of the graph structure in the generator through adversarial training, accurate node prediction can be completed using only a node's number rather than its specific position in the graph. Specifically, we first show that the key to determining the embedding of a specific node is the corresponding row of the eigenvector matrix of the graph Laplacian; by making this row the output of the generator, we design a method that hides the node's position in noise: given the corresponding noise as input, the generator produces the node's connection structure instead of the structure being observed directly. The encoder and decoder are then combined in the discriminator, so that after adversarial training the generator and discriminator cooperate to encode and decode the graph and complete node prediction. Finally, all node positions can be generated from noise at the same time; in other words, the generator hides all the connection information of the graph structure. Our evaluation shows that only the initial node features and node numbers are needed to complete node prediction, and accuracy does not decrease; it instead increases by 0.0293.





NTIRE 2020 Challenge on NonHomogeneous Dehazing. (arXiv:2005.03457v1 [cs.CV])

This paper reviews the NTIRE 2020 Challenge on NonHomogeneous Dehazing of images (restoration of rich details in hazy images). We focus on the proposed solutions and their results evaluated on NH-Haze, a novel dataset consisting of 55 pairs of real haze-free and non-homogeneous hazy images recorded outdoors. NH-Haze is the first realistic non-homogeneous haze dataset that provides ground-truth images. The non-homogeneous haze was produced using a professional haze generator that imitates the real conditions of hazy scenes. 168 participants registered for the challenge and 27 teams competed in the final testing phase. The proposed solutions gauge the state of the art in image dehazing.





Successfully Applying the Stabilized Lottery Ticket Hypothesis to the Transformer Architecture. (arXiv:2005.03454v1 [cs.LG])

Sparse models require less memory for storage and enable faster inference by reducing the necessary number of FLOPs. This is relevant both for time-critical and on-device computations using neural networks. The stabilized lottery ticket hypothesis states that networks can be pruned after none or few training iterations, using a mask computed based on the unpruned converged model. On the transformer architecture and the WMT 2014 English-to-German and English-to-French tasks, we show that stabilized lottery ticket pruning performs similarly to magnitude pruning for sparsity levels of up to 85%, and propose a new combination of pruning techniques that outperforms all other techniques for even higher levels of sparsity. Furthermore, we confirm that a parameter's initial sign, and not its specific value, is the primary factor for successful training, and show that magnitude pruning cannot be used to find winning lottery tickets.
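As a concrete illustration, here is a minimal sketch (assuming PyTorch; function names are ours) of the magnitude-based masking step that both magnitude pruning and stabilized lottery ticket pruning rely on; in lottery ticket pruning the mask is computed on the converged model and then applied to weights rewound to an early checkpoint:

```python
import torch

def global_magnitude_mask(model, sparsity):
    """Keep the largest-magnitude fraction (1 - sparsity) of all weights."""
    all_w = torch.cat([p.detach().abs().flatten() for p in model.parameters()])
    k = max(1, int(sparsity * all_w.numel()))
    threshold = all_w.kthvalue(k).values          # k-th smallest magnitude
    return {name: (p.detach().abs() > threshold).float()
            for name, p in model.named_parameters()}

def apply_mask(model, mask):
    """Zero out pruned weights in place (e.g. on the rewound checkpoint)."""
    with torch.no_grad():
        for name, p in model.named_parameters():
            p.mul_(mask[name])
```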





Parametrized Universality Problems for One-Counter Nets. (arXiv:2005.03435v1 [cs.FL])

We study the language universality problem for One-Counter Nets, also known as 1-dimensional Vector Addition Systems with States (1-VASS), parameterized either with an initial counter value or with an upper bound on the allowed counter value during runs. The language accepted by an OCN (defined by reaching a final control state) is monotone in both parameters. This yields two natural questions: 1) Does there exist an initial counter value that makes the language universal? 2) Does there exist a sufficiently high ceiling so that the bounded language is universal? Despite the fact that unparameterized universality is Ackermann-complete and that these problems seem to reduce to checking basic structural properties of the underlying automaton, we show that both problems are in fact undecidable. We also study the complexity of the problems for several decidable subclasses, namely unambiguous systems, deterministic systems, and those over a single-letter alphabet.





Simultaneous topology and fastener layout optimization of assemblies considering joint failure. (arXiv:2005.03398v1 [cs.CE])

This paper provides a method for the simultaneous topology optimization of parts and their corresponding joint locations in an assembly. Therein, the joint locations are not discrete and predefined, but continuously movable. The underlying coupling equations allow for connecting dissimilar meshes and avoid the need for remeshing when joint locations change. Rather than modeling the force transfer at a joint location with single spring elements alone, the presented method accounts for the size and type of the joints. When considering riveted or bolted joints, the local part geometry at the joint location consists of holes that are surrounded by material. For spot welds, the joint locations are filled with material and may be smaller than for bolts. The presented method incorporates these material and clearance zones into the simultaneously running topology optimization of the parts. Furthermore, failure of joints may be taken into account at the optimization stage, yielding assemblies connected in a fail-safe manner.





Semantic Signatures for Large-scale Visual Localization. (arXiv:2005.03388v1 [cs.CV])

Visual localization is a useful alternative to standard, camera-based localization techniques. In a typical scenario, features are extracted from captured images and compared with geo-referenced databases, and location information is then inferred from the matching results. Conventional schemes mainly use low-level visual features; these approaches offer good accuracy but suffer from scalability issues. In order to assist localization in large urban areas, this work explores a different path by utilizing high-level semantic information. We find that object information in a street view can facilitate localization, and we propose a novel descriptor scheme called the "semantic signature" to summarize this information. A semantic signature consists of the type and angle information of visible objects at a spatial location. Several metrics and protocols are proposed for signature comparison and retrieval, illustrating different trade-offs between accuracy and complexity. Extensive simulation results confirm the potential of the proposed scheme in large-scale applications. This paper is an extended version of a conference paper at CBMI'18; a more efficient retrieval protocol is presented along with additional experimental results.
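To make the idea concrete, a toy comparison of two signatures might look like the sketch below (the encoding and metric are our illustrative assumptions, not the paper's exact protocol):

```python
# Hypothetical signature format: each visible object at a location is a
# (type_id, bearing_degrees) pair. The encoding and metric here are our
# illustrative assumptions, not the paper's exact protocol.
def signature_distance(sig_a, sig_b, angle_weight=1.0 / 180.0):
    """Greedily match objects by type, penalizing angular differences."""
    unmatched = list(sig_b)
    cost = 0.0
    for obj_type, angle in sig_a:
        same_type = [o for o in unmatched if o[0] == obj_type]
        if not same_type:
            cost += 1.0                                   # missing object
            continue
        circ = lambda a, b: abs((a - b + 180.0) % 360.0 - 180.0)
        best = min(same_type, key=lambda o: circ(o[1], angle))
        cost += angle_weight * circ(best[1], angle)
        unmatched.remove(best)
    return cost + len(unmatched)                          # extra objects

loc_a = [(0, 90.0), (1, 200.0)]    # e.g. 0 = streetlight, 1 = traffic sign
loc_b = [(0, 100.0), (1, 185.0)]
print(signature_distance(loc_a, loc_b))
```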





WSMN: An optimized multipurpose blind watermarking in Shearlet domain using MLP and NSGA-II. (arXiv:2005.03382v1 [cs.CR])

Digital watermarking is an important topic in the field of information security for preventing the misuse of images in multimedia networks. Although cryptography can prevent access by unauthorized persons, it cannot simultaneously provide copyright protection or content authentication while preserving image integrity. Hence, this paper presents an optimized multipurpose blind watermarking scheme in the Shearlet domain with the help of smart algorithms including MLP and NSGA-II. In this method, four copies of the robust copyright logo are embedded in the approximate coefficients of the Shearlet transform by using an effective quantization technique. Furthermore, an embedded random sequence serving as a semi-fragile authentication mark is effectively extracted from the detail coefficients by the neural network. By performing an effective optimization algorithm to select optimal embedding thresholds, and by distinguishing the texture of blocks, both imperceptibility and robustness are preserved. The experimental results reveal the superiority of the scheme over other state-of-the-art schemes with regard to the quality of watermarked images and robustness against hybrid attacks. The average PSNR and SSIM of the dual watermarked images are 38 dB and 0.95, respectively; moreover, the scheme can effectively extract the copyright logo and locate forgery regions under severe attacks with satisfactory accuracy.





2kenize: Tying Subword Sequences for Chinese Script Conversion. (arXiv:2005.03375v1 [cs.CL])

Simplified Chinese to Traditional Chinese character conversion is a common preprocessing step in Chinese NLP. Despite this, current approaches have poor performance because they do not take into account that a simplified Chinese character can correspond to multiple traditional characters. Here, we propose a model that can disambiguate between mappings and convert between the two scripts. The model is based on subword segmentation, two language models, and a method for mapping between subword sequences. We further construct benchmark datasets for topic classification and script conversion. Our proposed method outperforms previous Chinese character conversion approaches by 6 points in accuracy. These results are further confirmed in a downstream application, where 2kenize is used to convert a pretraining dataset for topic classification. An error analysis reveals that our method's particular strengths are in dealing with code-mixing and named entities.





Regression Forest-Based Atlas Localization and Direction Specific Atlas Generation for Pancreas Segmentation. (arXiv:2005.03345v1 [cs.CV])

This paper proposes a fully automated atlas-based pancreas segmentation method from CT volumes, utilizing atlas localization by regression forest and atlas generation using blood vessel information. Previous probabilistic atlas-based pancreas segmentation methods cannot deal well with the spatial variations that are commonly found in the pancreas, and shape variations are not represented by an averaged atlas. We propose a fully automated pancreas segmentation method that deals with both types of variation. The position and size of the pancreas are estimated using a regression forest technique. After localization, a patient-specific probabilistic atlas is generated based on a new image similarity that reflects the blood vessel position and direction information around the pancreas. Segmentation is performed using the EM algorithm with the atlas as prior, followed by graph cut. In an evaluation using 147 CT volumes, the Jaccard index and Dice overlap of the proposed method were 62.1% and 75.1%, respectively. Although all of the segmentation processes are automated, the results were superior in Dice overlap to other state-of-the-art methods.





Bitvector-aware Query Optimization for Decision Support Queries (extended version). (arXiv:2005.03328v1 [cs.DB])

Bitvector filtering is an important query processing technique that can significantly reduce the cost of execution, especially for complex decision support queries with multiple joins. Despite its wide application, however, its implications for query optimization are not well understood.

In this work, we study how bitvector filters impact query optimization. We show that naively incorporating bitvector filters into query optimization can increase the plan space complexity by a factor exponential in the number of relations in the query. We analyze plans with bitvector filters for star and snowflake queries in the plan space of right-deep trees without cross products. Surprisingly, under some simplifying assumptions, we prove that the minimal-cost plan with bitvector filters can be found among a number of plans that is linear in the number of relations in the query. This greatly reduces the plan space complexity for such queries from exponential to linear.

Motivated by our analysis, we propose an algorithm that accounts for the impact of bitvector filters in query optimization. Our algorithm optimizes the join order for an arbitrary decision support query by choosing from a number of candidate plans that is linear in the number of relations in the query. We implement our algorithm in Microsoft SQL Server as a transformation rule. Our evaluation on both industry-standard benchmarks and a customer workload shows that, compared with the original Microsoft SQL Server, our technique reduces the total CPU execution time by 22%-64% for these workloads, with up to two orders of magnitude reduction in CPU execution time for individual queries.
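As background, the sketch below illustrates the bitvector filtering idea itself (a simplified, Bloom-style filter of our own, not SQL Server's implementation): probe-side rows that cannot join are discarded before the join operator ever sees them.

```python
# Minimal bitvector (Bloom-style) filter in a hash join: build a bit array
# from the build-side keys, then drop probe-side rows whose bit is unset,
# so only potentially matching rows reach the expensive join work.
NBITS = 1 << 16

def build_bitvector(keys):
    bits = bytearray(NBITS // 8)
    for k in keys:
        h = hash(k) % NBITS
        bits[h // 8] |= 1 << (h % 8)
    return bits

def maybe_contains(bits, k):
    h = hash(k) % NBITS
    return bits[h // 8] & (1 << (h % 8)) != 0

dim_keys = [1, 7, 42]                              # build side (dimension table)
fact_rows = [(i, i % 100) for i in range(10_000)]  # (row_id, join_key)

bv = build_bitvector(dim_keys)
survivors = [r for r in fact_rows if maybe_contains(bv, r[1])]
print(len(survivors), "of", len(fact_rows), "rows reach the join's probe phase")
```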





Global Distribution of Google Scholar Citations: A Size-independent Institution-based Analysis. (arXiv:2005.03324v1 [cs.DL])

Most currently available schemes for performance-based ranking of universities or research organizations, such as Quacquarelli Symonds (QS), Times Higher Education (THE), and the Shanghai Ranking's Academic Ranking of World Universities (ARWU), use a variety of criteria that include productivity, citations, awards, and reputation, while Leiden and Scimago use only bibliometric indicators. The research performance evaluation in the aforesaid cases is based on bibliometric data from Web of Science or Scopus, which are commercially available priced databases whose coverage includes peer-reviewed journals and conference proceedings. Google Scholar (GS), on the other hand, provides a free and open alternative for obtaining citations of papers available on the net (though it is not clear exactly which journals are covered). Citations are collected automatically from the net and added to self-created individual author profiles under Google Scholar Citations (GSC). This data was used by the Cybermetrics Lab (Webometrics), Spain, to create a ranked list of over 4,000 institutions in 2016, based on citations from only the top 10 individual GSC profiles in each organization. (GSC excludes the top paper for reasons explained in the text; the simple selection procedure makes the ranked list size-independent, as claimed by the Cybermetrics Lab.) Using this data (Transparent Ranking, TR, 2016), we find the regional and country-wise distribution of GS-TR citations. The size-independent ranked list is subdivided into deciles of 400 institutions each, and the number of institutions and citations of each country is obtained for each decile. We also test for correlation between institutional ranks in GS-TR and the other ranking schemes for the top 20 institutions.





Boosting Cloud Data Analytics using Multi-Objective Optimization. (arXiv:2005.03314v1 [cs.DB])

Data analytics in the cloud has become an integral part of enterprise businesses. Big data analytics systems, however, still lack the ability to take user performance goals and budgetary constraints for a task, collectively referred to as task objectives, and automatically configure an analytic job to achieve these objectives. This paper presents a data analytics optimizer that can automatically determine a cluster configuration with a suitable number of cores, as well as other system parameters, that best meets the task objectives. At the core of our work is a principled multi-objective optimization (MOO) approach that computes a Pareto-optimal set of job configurations to reveal tradeoffs between different user objectives, recommends a new job configuration that best explores such tradeoffs, and employs novel optimizations to enable such recommendations within a few seconds. We present efficient incremental algorithms based on the notion of a Progressive Frontier for realizing our MOO approach and implement them in a Spark-based prototype. Detailed experiments using benchmark workloads show that our MOO techniques provide a 2-50x speedup over existing MOO methods, while offering good coverage of the Pareto frontier. Compared to Ottertune, a state-of-the-art performance tuning system, our approach recommends configurations that reduce the running time of the TPCx-BB benchmark by 26%-49% while adapting to different application preferences on multiple objectives.
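For intuition, a minimal sketch of what a Pareto-optimal set of configurations means for two objectives to be minimized (illustrative only; the paper's Progressive Frontier algorithms compute and refine this set incrementally):

```python
# Brute-force Pareto front over candidate job configurations for two
# objectives to be minimized, e.g. (latency_seconds, dollar_cost).
def pareto_front(configs):
    """configs: list of (label, (obj1, obj2)); returns the non-dominated ones."""
    front = []
    for name, (a1, a2) in configs:
        dominated = any(b1 <= a1 and b2 <= a2 and (b1, b2) != (a1, a2)
                        for _, (b1, b2) in configs)
        if not dominated:
            front.append((name, (a1, a2)))
    return front

configs = [("8 cores", (120.0, 0.40)), ("16 cores", (70.0, 0.55)),
           ("32 cores", (45.0, 0.90)), ("16 cores, big mem", (80.0, 0.70))]
print(pareto_front(configs))   # "16 cores, big mem" is dominated by "16 cores"
```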





Nakdan: Professional Hebrew Diacritizer. (arXiv:2005.03312v1 [cs.CL])

We present a system for automatic diacritization of Hebrew text. The system combines modern neural models with carefully curated declarative linguistic knowledge and comprehensive manually constructed tables and dictionaries. Besides providing state-of-the-art diacritization accuracy, the system supports an interface for manual editing and correction of the automatic output, and has several features which make it particularly useful for the preparation of scientific editions of Hebrew texts. The system supports Modern Hebrew, Rabbinic Hebrew, and Poetic Hebrew. The system is freely accessible for all use at this http URL





Interval type-2 fuzzy logic system based similarity evaluation for image steganography. (arXiv:2005.03310v1 [cs.MM])

A similarity measure, also called an information measure, is a concept used to distinguish different objects. It has been studied in different contexts using mathematical, psychological, and fuzzy approaches. Image steganography is the art of hiding secret data in an image in such a way that it cannot be detected by an intruder. In image steganography, hiding secret data in the plain or non-edge regions of the image is significant due to the high similarity and redundancy of the pixels in their neighborhood. However, the similarity measure of the neighboring pixels, i.e., their proximity in color space, is perceptual rather than mathematical. This paper proposes an interval type-2 fuzzy logic system (IT2 FLS) to determine the similarity between neighboring pixels by involving instinctive human perception through a rule-based approach. The pixels of the image having high similarity values, calculated using the proposed IT2 FLS similarity measure, are selected for embedding via the least significant bit (LSB) method. We term the proposed steganographic procedure the IT2 FLS LSB method. Moreover, we develop two further methods: a type-1 fuzzy logic system based least significant bit (T1FLS LSB) method and a Euclidean distance based similarity measure for least significant bit (SM LSB) steganography. Experimental simulations were conducted on a collection of images, and quality index metrics such as PSNR, UQI, and SSIM are used. All three steganographic methods are applied to the datasets and the quality metrics are calculated. The resulting stego images and metrics are thoroughly compared to determine the efficacy of the IT2 FLS LSB method. Finally, we present a comparative analysis of the proposed approach with existing well-known steganographic methods to show the effectiveness of our proposed method.
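The sketch below shows the final LSB embedding step shared by all three methods, with the fuzzy similarity measure stood in for by a simple local-variance smoothness test (our simplification; the paper's IT2 FLS is a rule-based system, not shown):

```python
import numpy as np

# LSB embedding restricted to "smooth" pixels. The smoothness test here is a
# stand-in for the paper's similarity measure; thresholds are illustrative.
def embed_lsb(image, bits, smooth_threshold=25.0):
    stego = image.copy()
    flat = stego.reshape(-1)            # view: writes modify stego in place
    bit_iter = iter(bits)
    for i, px in enumerate(flat):
        window = flat[max(0, i - 4): i + 5]
        if np.var(window.astype(float)) < smooth_threshold:
            try:
                b = next(bit_iter)
            except StopIteration:
                break                   # all secret bits embedded
            flat[i] = (px & 0xFE) | b   # overwrite the lowest bit
    return stego

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
secret = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(img, secret)
```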





On the unique solution of the generalized absolute value equation. (arXiv:2005.03287v1 [math.NA])

In this paper, some useful necessary and sufficient conditions for the unique solution of the generalized absolute value equation (GAVE) $Ax - B|x| = b$ with $A, B \in \mathbb{R}^{n \times n}$ are first presented from the optimization field; these cover the fundamental theorem for the unique solution of the linear system $Ax = b$ with $A \in \mathbb{R}^{n \times n}$. In addition, some new sufficient conditions for the unique solution of the GAVE are obtained, which are weaker than those in previously published works.
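For context, one classical sufficient condition of this kind from the earlier literature (the paper's new conditions are weaker than such results) is

$$\sigma_{\min}(A) > \sigma_{\max}(B) \;\Longrightarrow\; Ax - B|x| = b \ \text{has a unique solution for every } b \in \mathbb{R}^n,$$

which recovers the linear-system case: for $B = 0$ it reduces to $\sigma_{\min}(A) > 0$, i.e., to $A$ being nonsingular.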





RNN-T Models Fail to Generalize to Out-of-Domain Audio: Causes and Solutions. (arXiv:2005.03271v1 [eess.AS])

In recent years, all-neural end-to-end approaches have obtained state-of-the-art results on several challenging automatic speech recognition (ASR) tasks. However, most existing works focus on building ASR models where train and test data are drawn from the same domain. This results in poor generalization on mismatched domains: e.g., end-to-end models trained on short segments perform poorly when evaluated on longer utterances. In this work, we analyze the generalization properties of streaming and non-streaming recurrent neural network transducer (RNN-T) based end-to-end models in order to identify model components that negatively affect generalization performance. We propose two solutions: combining multiple regularization techniques during training, and using dynamic overlapping inference. On a long-form YouTube test set, when the non-streaming RNN-T model is trained with shorter segments of data, the proposed combination improves word error rate (WER) from 22.3% to 14.8%; when the streaming RNN-T model is trained on short Search queries, the proposed techniques improve WER on the YouTube set from 67.0% to 25.3%. Finally, when trained on Librispeech, we find that dynamic overlapping inference improves WER on YouTube from 99.8% to 33.0%.
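To illustrate the inference-time fix, here is a minimal sketch of overlapping inference on long-form audio (our simplification: fixed-length windows and a fixed seam rule, whereas the paper's dynamic variant adapts the segmentation):

```python
# Decode fixed-length overlapping windows and stitch the hypotheses by
# keeping, from each window, only words timestamped between its seams.
def overlapping_inference(total_dur, decode, seg_len=8.0, overlap=2.0):
    """decode(start, end) -> list of (timestamp, word) hypotheses."""
    out, start = [], 0.0
    while start < total_dur:
        end = min(start + seg_len, total_dur)
        lo = start + overlap / 2 if start > 0.0 else 0.0        # left seam
        hi = end - overlap / 2 if end < total_dur else end + 1  # right seam
        out.extend(w for t, w in decode(start, end) if lo <= t < hi)
        if end >= total_dur:
            break
        start = end - overlap
    return out

# toy "decoder" that emits one word per second with its timestamp
fake_decode = lambda s, e: [(t, f"w{int(t)}") for t in range(int(s), int(e))]
print(overlapping_inference(20.0, fake_decode))   # w0 .. w19, no duplicates
```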





Online Proximal-ADMM For Time-varying Constrained Convex Optimization. (arXiv:2005.03267v1 [eess.SY])

This paper considers a convex optimization problem with cost and constraints that evolve over time. The function to be minimized is strongly convex and possibly non-differentiable, and variables are coupled through linear constraints. In this setting, the paper proposes an online algorithm based on the alternating direction method of multipliers (ADMM) to track the optimal solution trajectory of the time-varying problem; in particular, the proposed algorithm consists of a primal proximal gradient descent step and an appropriately perturbed dual ascent step. The paper derives tracking results, asymptotic bounds, and linear convergence results. The proposed algorithm is then specialized to a multi-area power grid optimization problem, and our numerical results verify the desired properties.
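Schematically, one round of such an online proximal primal-dual method at time $t$ could take the form (a generic template in our notation, not necessarily the paper's exact perturbed iteration):

$$x_{t+1} = \operatorname{prox}_{\alpha g}\big(x_t - \alpha\,(\nabla f_t(x_t) + A^\top \lambda_t)\big), \qquad \lambda_{t+1} = \lambda_t + \beta\,(A x_{t+1} - b_t),$$

where $f_t$ is the smooth part of the time-varying cost, $g$ is the non-differentiable part handled by the proximal operator, and $Ax = b_t$ is the linear coupling constraint.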





Coding for Optimized Writing Rate in DNA Storage. (arXiv:2005.03248v1 [cs.IT])

A method for encoding information in DNA sequences is described. The method is based on the precision-resolution framework, and is designed to work in conjunction with a recently suggested terminator-free template-independent DNA synthesis method. The suggested method optimizes the number of information bits per synthesis time unit, namely, the writing rate. Additionally, the encoding scheme takes into account the existence of multiple copies of the DNA sequence, which are independently distorted. Finally, quantizers for various run-length distributions are designed.





Phase retrieval of complex-valued objects via a randomized Kaczmarz method. (arXiv:2005.03238v1 [cs.IT])

This paper investigates the convergence of the randomized Kaczmarz algorithm for the problem of phase retrieval of complex-valued objects. While this algorithm has been studied for the real-valued case, its generalization to the complex-valued case is nontrivial and has been left as a conjecture. This paper establishes a connection between the convergence of the algorithm and the convexity of an objective function. Based on this connection, it demonstrates that when the sensing vectors are sampled uniformly from a unit sphere and the number of sensing vectors $m$ satisfies $m > O(n \log n)$ as $n, m \rightarrow \infty$, then this algorithm with a good initialization achieves linear convergence to the solution with high probability.
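A minimal numerical sketch of the complex-valued randomized Kaczmarz update for amplitude measurements $y_r = |\langle a_r, x \rangle|$ (our implementation; step details may differ from the paper, whose guarantee also assumes a good, e.g. spectral, initialization rather than the random one used here):

```python
import numpy as np

# Each step keeps the phase of the selected measurement at the current
# iterate and projects onto the corresponding hyperplane.
rng = np.random.default_rng(0)
n, m = 32, 8 * 32
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = np.abs(A @ x_true)                         # amplitude-only measurements

x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
for _ in range(20000):
    r = rng.integers(m)
    a = A[r]
    inner = np.vdot(a, x)                      # <a_r, x>, conjugating a_r
    phase = inner / abs(inner) if inner != 0 else 1.0
    x += (y[r] * phase - inner) / np.vdot(a, a).real * a

# relative error up to a global phase (small when the iteration converges)
align = np.vdot(x_true, x)
err = np.linalg.norm(x - x_true * align / abs(align))
print(err / np.linalg.norm(x_true))
```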





A Stochastic Geometry Approach to Doppler Characterization in a LEO Satellite Network. (arXiv:2005.03205v1 [cs.IT])

A Non-terrestrial Network (NTN) comprising Low Earth Orbit (LEO) satellites can enable connectivity to underserved areas, thus complementing existing telecom networks. The high-speed satellite motion poses several challenges at the physical layer such as large Doppler frequency shifts. In this paper, an analytical framework is developed for statistical characterization of Doppler shift in an NTN where LEO satellites provide communication services to terrestrial users. Using tools from stochastic geometry, the users within a cell are grouped into disjoint clusters to limit the differential Doppler across users. Under some simplifying assumptions, the cumulative distribution function (CDF) and the probability density function are derived for the Doppler shift magnitude at a random user within a cluster. The CDFs are also provided for the minimum and the maximum Doppler shift magnitude within a cluster. Leveraging the analytical results, the interplay between key system parameters such as the cluster size and satellite altitude is examined. Numerical results validate the insights obtained from the analysis.
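For a sense of scale, a back-of-the-envelope computation of the worst-case Doppler shift (our illustrative numbers, not the paper's system parameters):

```python
# Worst-case Doppler shift f_d = (v_rel / c) * f_c, attained when the
# satellite moves directly toward or away from the user.
C = 3.0e8        # speed of light, m/s
v_leo = 7.59e3   # LEO orbital speed at ~550 km altitude, m/s
f_c = 2.0e9      # carrier frequency, Hz (S-band)

f_d_max = v_leo / C * f_c
print(f"max Doppler shift ~ {f_d_max / 1e3:.1f} kHz")   # ~ 50.6 kHz
```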





Enabling Cross-chain Transactions: A Decentralized Cryptocurrency Exchange Protocol. (arXiv:2005.03199v1 [cs.CR])

Inspired by Bitcoin, many different kinds of cryptocurrencies based on blockchain technology have appeared on the market. Due to the special structure of the blockchain, it has been deemed impossible to directly trade between traditional currencies and cryptocurrencies, or between different types of cryptocurrencies. Generally, trading between different currencies is conducted through a centralized third-party platform. However, such a platform is a single point of failure, which is vulnerable to attacks and thus affects the security of transactions. In this paper, we propose a distributed cryptocurrency trading scheme that solves the problem of centralized exchanges and enables trading between different types of cryptocurrencies. Our scheme is implemented with smart contracts on the Ethereum blockchain and deployed on the Ethereum test network. We implement not only transactions between individual users, but also transactions between multiple users. The experimental results show that the cost of our scheme is acceptable.





Recognizing Exercises and Counting Repetitions in Real Time. (arXiv:2005.03194v1 [cs.CV])

Artificial intelligence has become essential in a variety of industries, including the fitness industry. Human pose estimation has been one of the major research topics in computer vision over the last few years. In this project, pose estimation and deep learning techniques are combined to analyze performance and report feedback on the repetitions of performed exercises in real time. Applying machine learning in the fitness industry could help judges count repetitions of any exercise during weightlifting or CrossFit competitions.
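A minimal sketch of the repetition-counting idea (our simplification: one scalar joint-angle signal derived from pose keypoints, counted with threshold hysteresis):

```python
import numpy as np

# Count one rep per down-up cycle of a joint angle (e.g. a hypothetical
# elbow angle per frame computed from pose keypoints, not shown here).
def count_reps(angle_series, low=70.0, high=150.0):
    reps, phase = 0, "up"
    for angle in angle_series:
        if phase == "up" and angle < low:       # descended into the rep
            phase = "down"
        elif phase == "down" and angle > high:  # extended back up
            phase = "up"
            reps += 1
    return reps

t = np.linspace(0, 10, 300)
angles = 110 + 60 * np.cos(2 * np.pi * 0.5 * t)   # 0.5 Hz => 5 cycles
print(count_reps(angles))                          # -> 5
```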





Distributed Stabilization by Probability Control for Deterministic-Stochastic Large Scale Systems : Dissipativity Approach. (arXiv:2005.03193v1 [eess.SY])

Using a dissipativity approach, we establish a stability condition for the feedback connection of a deterministic dynamical system $\Sigma$ and a stochastic memoryless map $\Psi$. We then extend the result to a class of large-scale systems in which $\Sigma$ consists of many sub-systems, and $\Psi$ consists of many "stochastic actuators" and "probability controllers" that control the actuators' output events. We demonstrate the proposed approach with design procedures that globally stabilize manufacturing systems while locally balancing stock levels in any production process.





Determinantal Point Processes in Randomized Numerical Linear Algebra. (arXiv:2005.03185v1 [cs.DS])

Randomized Numerical Linear Algebra (RandNLA) uses randomness to develop improved algorithms for matrix problems that arise in scientific computing, data science, machine learning, etc. Determinantal Point Processes (DPPs), a seemingly unrelated topic in pure and applied mathematics, is a class of stochastic point processes with probability distribution characterized by sub-determinants of a kernel matrix. Recent work has uncovered deep and fruitful connections between DPPs and RandNLA which lead to new guarantees and improved algorithms that are of interest to both areas. We provide an overview of this exciting new line of research, including brief introductions to RandNLA and DPPs, as well as applications of DPPs to classical linear algebra tasks such as least squares regression, low-rank approximation and the Nyström method. For example, random sampling with a DPP leads to new kinds of unbiased estimators for least squares, enabling more refined statistical and inferential understanding of these algorithms; a DPP is, in some sense, an optimal randomized algorithm for the Nyström method; and a RandNLA technique called leverage score sampling can be derived as the marginal distribution of a DPP. We also discuss recent algorithmic developments, illustrating that, while not quite as efficient as standard RandNLA techniques, DPP-based algorithms are only moderately more expensive.
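As one concrete point of contact, here is a minimal sketch of leverage-score sampling for least squares, the RandNLA primitive that the survey notes arises as the marginal distribution of a DPP (a standard recipe; the DPP connection itself is in the paper):

```python
import numpy as np

# Sample rows i.i.d. proportionally to their leverage scores and reweight,
# giving an unbiased sketched least-squares problem.
rng = np.random.default_rng(1)
n, d, m = 2000, 10, 200
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

U, _, _ = np.linalg.svd(A, full_matrices=False)
lev = (U ** 2).sum(axis=1)              # leverage scores, summing to d
p = lev / lev.sum()

idx = rng.choice(n, size=m, p=p)
w = 1.0 / np.sqrt(m * p[idx])           # reweighting keeps the sketch unbiased
x_sketch, *_ = np.linalg.lstsq(A[idx] * w[:, None], b[idx] * w, rcond=None)
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x_sketch - x_full))
```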





Evolutionary Multi Objective Optimization Algorithm for Community Detection in Complex Social Networks. (arXiv:2005.03181v1 [cs.NE])

Most optimization-based community detection approaches formulate the problem in a single- or bi-objective framework. In this paper, we propose two variants of a three-objective formulation using a customized non-dominated sorting genetic algorithm III (NSGA-III) to find community structures in a network. The first variant, named NSGA-III-KRM, considers kernel k-means, ratio cut, and modularity as the three objectives, whereas the second variant, named NSGA-III-CCM, considers community score, community fitness, and modularity as the three objective functions. Experiments are conducted on four benchmark network datasets. Comparison with state-of-the-art approaches, along with decomposition-based multi-objective evolutionary algorithm variants (MOEA/D-KRM and MOEA/D-CCM), indicates that the proposed variants yield comparable or better results. This is particularly significant because the addition of the third objective does not worsen the results of the other two objectives. We also propose a simple method to rank the obtained Pareto solutions by introducing a new measure, namely the ratio of the hypervolume to the inverted generational distance (IGD); the higher the ratio, the better the Pareto set. This strategy is particularly useful in the absence of an empirical attainment function in the multi-objective framework when the number of objectives is more than two.





A Parameterized Perspective on Attacking and Defending Elections. (arXiv:2005.03176v1 [cs.GT])

We consider the problems of protecting and manipulating elections by recounting and changing ballots, respectively. Our setting involves a plurality-based election held across multiple districts, and the problem formulations are based on the model proposed recently by [Elkind et al., IJCAI 2019]. It turns out that both the manipulation and protection problems are NP-complete even in fairly simple settings. We study these problems from a parameterized perspective with the goal of establishing a more detailed complexity landscape. The parameters we consider include the number of voters and the budgets of the attacker and the defender. While we observe fixed-parameter tractability when parameterizing by the number of voters, our main contribution is a demonstration of parameterized hardness when working with the budgets of the attacker and the defender.





On Optimal Control of Discounted Cost Infinite-Horizon Markov Decision Processes Under Local State Information Structures. (arXiv:2005.03169v1 [eess.SY])

This paper investigates a class of optimal control problems associated with Markov processes with local state information. The decision-maker has only local access to a subset of the state vector, as often encountered in decentralized control problems in multi-agent systems. Under this information structure, part of the state vector cannot be observed. We leverage ab initio principles and find a new form of Bellman equations to characterize the optimal policies of the control problem under local information structures. The dynamic programming solutions feature a mixture of dynamics associated with the unobservable state components and a local state-feedback policy based on the observable local information. We further characterize the optimal local-state feedback policy using linear programming methods. To reduce the computational complexity of the optimal policy, we propose an approximate algorithm based on virtual beliefs to find a sub-optimal policy. We show performance bounds on the sub-optimal solution and corroborate the results with numerical case studies.
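For reference, the classical discounted Bellman equation under full state observation, which the paper generalizes to local information structures, is

$$V(s) = \max_{a \in \mathcal{A}} \Big[ r(s, a) + \gamma \sum_{s'} P(s' \mid s, a)\, V(s') \Big], \qquad \gamma \in (0, 1);$$

here, in contrast, the policy may depend only on the observable components of $s$.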





Decentralized Adaptive Control for Collaborative Manipulation of Rigid Bodies. (arXiv:2005.03153v1 [cs.RO])

In this work, we consider a group of robots working together to manipulate a rigid object so as to track a desired trajectory in $SE(3)$. The robots have no explicit communication network among them, and they do not know the mass or friction properties of the object, or where they are attached to the object. However, we assume they share data from a common IMU placed arbitrarily on the object. To solve this problem, we propose a decentralized adaptive control scheme wherein each agent maintains and adapts its own estimate of the object parameters in order to track a reference trajectory. We present an analysis of the controller's behavior, showing that all closed-loop signals remain bounded and that the system trajectory will almost always (except for initial conditions on a set of measure zero) converge to the desired trajectory. We study the proposed controller's performance using numerical simulations of a manipulation task in 3D, and with hardware experiments which demonstrate our algorithm on a planar manipulation task. These studies, taken together, demonstrate the effectiveness of the proposed controller even in the presence of numerous unmodelled effects, such as discretization errors and complex frictional interactions.





Scale-Equalizing Pyramid Convolution for Object Detection. (arXiv:2005.03101v1 [cs.CV])

The feature pyramid has been an efficient method to extract features at different scales. Development of this method has mainly focused on aggregating contextual information at different levels, while seldom touching the inter-level correlation in the feature pyramid. Early computer vision methods extracted scale-invariant features by locating the feature extrema in both the spatial and scale dimensions. Inspired by this, a convolution across the pyramid levels is proposed in this study, which is termed pyramid convolution and is a modified 3-D convolution. Stacked pyramid convolutions directly extract 3-D (scale and spatial) features and outperform other meticulously designed feature fusion modules. Based on the viewpoint of 3-D convolution, an integrated batch normalization that collects statistics from the whole feature pyramid is naturally inserted after the pyramid convolution. Furthermore, we also show that the naive pyramid convolution, together with the design of the RetinaNet head, actually best applies to extracting features from a Gaussian pyramid, whose properties can hardly be satisfied by a feature pyramid. In order to alleviate this discrepancy, we build a scale-equalizing pyramid convolution (SEPC) that aligns the shared pyramid convolution kernel only at high-level feature maps. Being computationally efficient and compatible with the head design of most single-stage object detectors, the SEPC module brings significant performance improvement ($>4$ AP increase on the MS-COCO2017 dataset) in state-of-the-art one-stage object detectors, and a light version of SEPC also has a $\sim 3.5$ AP gain with only around 7% inference time increase. The pyramid convolution also functions well as a stand-alone module in two-stage object detectors and is able to improve the performance by $\sim 2$ AP. The source code can be found at https://github.com/jshilong/SEPC.
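A minimal sketch of a plain (non-scale-equalizing) pyramid convolution, assuming PyTorch and power-of-two level sizes (our simplification of the idea; SEPC additionally aligns the shared kernel across levels):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Each output level mixes a stride-2 conv of the finer level, a conv of its
# own level, and an upsampled conv of the coarser level: a 3-tap conv across
# the scale dimension.
class PyramidConv(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.from_finer = nn.Conv2d(ch, ch, 3, stride=2, padding=1)
        self.from_same = nn.Conv2d(ch, ch, 3, stride=1, padding=1)
        self.from_coarser = nn.Conv2d(ch, ch, 3, stride=1, padding=1)

    def forward(self, feats):   # feats: finest-to-coarsest list of (N,C,H,W)
        outs = []
        for l, x in enumerate(feats):
            y = self.from_same(x)
            if l > 0:                      # finer neighbor, reduced to this level
                y = y + self.from_finer(feats[l - 1])
            if l + 1 < len(feats):         # coarser neighbor, upsampled to match
                up = F.interpolate(self.from_coarser(feats[l + 1]),
                                   size=x.shape[-2:], mode="nearest")
                y = y + up
            outs.append(y)
        return outs

pyr = [torch.randn(1, 16, 64, 64), torch.randn(1, 16, 32, 32),
       torch.randn(1, 16, 16, 16)]
print([o.shape for o in PyramidConv(16)(pyr)])
```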





Optimal Location of Cellular Base Station via Convex Optimization. (arXiv:2005.03099v1 [cs.IT])

An optimal base station (BS) location depends on the traffic (user) distribution, propagation pathloss and many system parameters, which renders its analytical study difficult so that numerical algorithms are widely used instead. In this paper, the problem is studied analytically. First, it is formulated as a convex optimization problem to minimize the total BS transmit power subject to quality-of-service (QoS) constraints, which also account for fairness among users. Due to its convex nature, Karush-Kuhn-Tucker (KKT) conditions are used to characterize a globally-optimum location as a convex combination of user locations, where convex weights depend on user parameters, pathloss exponent and overall geometry of the problem. Based on this characterization, a number of closed-form solutions are obtained. In particular, the optimum BS location is the mean of user locations in the case of free-space propagation and identical user parameters. If the user set is symmetric (as defined in the paper), the optimal BS location is independent of pathloss exponent, which is not the case in general. The analytical results show the impact of propagation conditions as well as system and user parameters on optimal BS location and can be used to develop design guidelines.
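A worked instance of the closed-form special case (coordinates are our own): with free-space propagation and identical user parameters, the convex weights are equal, so the optimal BS location is the mean of the user locations.

```python
import numpy as np

# Optimal BS location as a convex combination of user locations; with
# identical users and free-space propagation the weights are all equal.
users = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 6.0]])   # user positions, km
weights = np.full(len(users), 1.0 / len(users))          # identical users
bs_opt = weights @ users                                  # convex combination
print(bs_opt)                                             # -> [2. 2.]
```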





Robust Trajectory and Transmit Power Optimization for Secure UAV-Enabled Cognitive Radio Networks. (arXiv:2005.03091v1 [cs.IT])

Cognitive radio is a promising technology to improve spectral efficiency. However, the secure performance of a secondary network achieved by using physical layer security techniques is limited by its transmit power and channel fading. In order to tackle this issue, a cognitive unmanned aerial vehicle (UAV) communication network is studied by exploiting the high flexibility of a UAV and the possibility of establishing line-of-sight links. The average secrecy rate of the secondary network is maximized by robustly optimizing the UAV's trajectory and transmit power. Our problem formulation takes into account two practical inaccurate location estimation cases, namely, the worst case and the outage-constrained case. In order to solve those challenging non-convex problems, an iterative algorithm based on the $\mathcal{S}$-Procedure is proposed for the worst case while an iterative algorithm based on Bernstein-type inequalities is proposed for the outage-constrained case. The proposed algorithms can obtain effective suboptimal solutions of the corresponding problems. Our simulation results demonstrate that the algorithm under the outage-constrained case can achieve a higher average secrecy rate with a low computational complexity compared to that of the algorithm under the worst case. Moreover, the proposed schemes can improve the secure communication performance significantly compared to other benchmark schemes.





A Multifactorial Optimization Paradigm for Linkage Tree Genetic Algorithm. (arXiv:2005.03090v1 [cs.NE])

Linkage Tree Genetic Algorithm (LTGA) is an effective Evolutionary Algorithm (EA) for solving complex problems using the linkage information between problem variables. LTGA performs well in various kinds of single-task optimization and yields promising results in comparison with the canonical genetic algorithm. However, LTGA is unsuitable for dealing with multi-task optimization problems. On the other hand, Multifactorial Optimization (MFO) can simultaneously solve independent optimization problems, which are encoded in a unified representation to take advantage of the process of knowledge transfer. In this paper, we introduce the Multifactorial Linkage Tree Genetic Algorithm (MF-LTGA), combining the main features of both LTGA and MFO. MF-LTGA is able to tackle multiple optimization tasks at the same time; each task learns the dependencies between problem variables from the shared representation. This knowledge serves to determine high-quality partial solutions that support other tasks in exploring the search space. Moreover, MF-LTGA speeds up convergence through knowledge transfer between related problems. We demonstrate the effectiveness of the proposed algorithm on two benchmark problems: the Clustered Shortest-Path Tree Problem and the Deceptive Trap Function. In comparison to LTGA and existing methods, MF-LTGA is superior in solution quality or in computation time.





I Always Feel Like Somebody's Sensing Me! A Framework to Detect, Identify, and Localize Clandestine Wireless Sensors. (arXiv:2005.03068v1 [cs.CR])

The increasing ubiquity of low-cost wireless sensors in smart homes and buildings has enabled users to easily deploy systems to remotely monitor and control their environments. However, this raises privacy concerns for third-party occupants, such as a hotel room guest who may be unaware of deployed clandestine sensors. Previous methods focused on specific modalities, such as detecting cameras, but do not provide a generalizable and comprehensive method to capture arbitrary sensors which may be "spying" on a user. In this work, we seek to determine whether one can walk into a room and detect any wireless sensor monitoring an individual. As such, we propose SnoopDog, a framework to not only detect wireless sensors that are actively monitoring a user, but also classify and localize each device. SnoopDog works by establishing causality between patterns in observable wireless traffic and a trusted sensor in the same space, e.g., an inertial measurement unit (IMU) that captures a user's movement. Once causality is established, SnoopDog performs packet inspection to inform the user about the monitoring device. Finally, SnoopDog localizes the clandestine device in a 2D plane using a novel trial-based localization technique. We evaluated SnoopDog across several devices and modalities; it was able to detect causality 96.6% of the time, classify suspicious devices with 100% accuracy, and localize devices to a sufficiently reduced sub-space.
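A minimal sketch of the causality test at SnoopDog's core, reduced here to a simple correlation between binned traffic volume and IMU motion energy (our simplification of the framework's actual causality analysis):

```python
import numpy as np

# If a wireless device is sensing you, its traffic volume should co-vary
# with your movement during scripted "move vs. stay still" periods.
def is_monitoring(packet_times, imu_energy, n_bins, threshold=0.6):
    """packet_times: packet timestamps (s); imu_energy: per-bin motion energy."""
    traffic, _ = np.histogram(packet_times, bins=n_bins)
    r = np.corrcoef(traffic.astype(float), imu_energy)[0, 1]
    return r > threshold, r

rng = np.random.default_rng(2)
imu = np.concatenate([np.ones(30), np.zeros(30)])    # move, then stay still
pkts = np.concatenate([rng.uniform(0, 30, 900),      # busy while moving
                       rng.uniform(30, 60, 150)])    # near-idle afterwards
print(is_monitoring(pkts, imu, n_bins=60))
```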





Evaluating text coherence based on the graph of the consistency of phrases to identify symptoms of schizophrenia. (arXiv:2005.03008v1 [cs.CL])

Different state-of-the-art methods for detecting symptoms of schizophrenia based on estimates of text coherence are analyzed. Analysis of a text at the level of phrases is suggested, and a method based on the graph of the consistency of phrases is proposed to evaluate the semantic coherence and cohesion of a text. The semantic coherence, cohesion, and other linguistic features (lexical diversity, lexical density) are used to form feature vectors for training a classifier model. The classifier is trained on a set of English-language interviews. Based on the retrieved results, the impact of each feature on the output of the model is analyzed. The results indicate that the proposed method based on the graph of the consistency of phrases may be used in various tasks for the detection of mental illness.
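A minimal sketch of one way such a phrase-consistency graph could yield a coherence score (an illustrative construction of ours; the paper's graph and scoring may differ):

```python
import numpy as np

# Nodes are phrase embeddings, edges connect adjacent phrases, and coherence
# is the mean cosine similarity along the edges.
def coherence(phrase_vecs):
    sims = []
    for u, v in zip(phrase_vecs, phrase_vecs[1:]):   # adjacent-phrase edges
        sims.append(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))
    return float(np.mean(sims))

rng = np.random.default_rng(3)
phrases = [rng.standard_normal(50) for _ in range(6)]   # stand-in embeddings
print(coherence(phrases))
```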





How Personalized Landing Pages Can Make Your Site More Profitable

Personalization is one of the most effective marketing techniques to connect with customers online. While the exact methods are different for every business, adding personalized elements to landing pages is a proven method of driving conversions on your site. But why is it so successful? The simple answer is that personalization shows customers that you […]






Going Beyond Sales: 7 Types of Website Conversions to Optimize for on Your Website

If you’re looking to grow your business online, it’s time to start setting up different types of website conversions to help your company succeed. Whether you’re looking to earn more email subscribers or sell more products, you can set conversion goals that grow your business. On this page, we’ll discuss what a conversion goal is, […]






Let's Discuss Memoization, or Should I Say Memoisation

“In computing, memoization or memoisation is an optimization technique used primarily to speed up computer programs by storing the results of expensive function calls and returning the cached result when the same inputs occur again” — Wikipedia article on memoization

I've written a two-part article on memoization in JavaScript. The first part explains the concept with a basic implementation in JavaScript code and details a way to apply the technique to function calls that is generic enough to handle most JavaScript functions.
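As a minimal illustration of the pattern (shown here in Python; the article itself implements it in JavaScript):

```python
import functools

# Hand-rolled memoization: cache results keyed by the call's arguments.
def memoize(fn):
    cache = {}
    @functools.wraps(fn)
    def wrapper(*args):
        if args not in cache:           # only hashable args are supported
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(80))   # fast: each fib(k) is computed once and then cached
```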






Dozens of Spokane, Coeur d'Alene events canceled due to public health concerns over COVID-19

After Governor Jay Inslee announced a prohibition on gatherings of 250 people or more in three Washington counties (Snohomish, King, Pierce) on Wednesday, and with public health concerns growing over the COVID-19 pandemic, many organizations in Spokane are following suit. The Inlander will be frequently updating its online calendar of events to reflect local cancelations as we hear of them.…





The Innovia Foundation's former president has finally won his three-year battle to stop the organization from donating to a racist website

There's one thing the Innovia Foundation can never say: That it hadn't been told.…





Privacy is disappearing faster than we realize, and the coronavirus isn't helping

The apps and devices you use are conducting surveillance with your every move. Sure, you lock your home, and you probably don't share your deepest secrets with random strangers.…





Spokane musician Eliza Johnson brought her quirky style — and tinned fish — to American Idol Sunday night. Watch the clip

Back in November, we wrote about local singer-songwriter Eliza Johnson's musical project Eliza Catastrophe and her new album You, which she released on pre-loaded MP3 players. One thing we weren't able to mention in our interview — for contractual reasons — is that she had only a couple months prior auditioned for American Idol, and her performance finally aired on the ABC reality competition show Sunday night.…





Tim and Eric rock the Beef House, Danzig sings Elvis, and more you need to know

The Buzz Bin. HERE'S THE BEEF…





Don't expect any socially distanced Zags games in the Kennel next year, and other thoughts from Gonzaga Athletic Director Mike Roth's online Q&A

Gonzaga Athletic Director Mike Roth took to the Zoom online meeting app Wednesday for a lengthy chat with members of the school community, fans and media to answer questions about college sports in the era of COVID-19. Like so many things regarding the coronavirus, there are a lot of hopes for a rapid return to normalcy — all of them couched in the reality that none of us really know how the pandemic is going to affect our lives three months from now, or six months down the line.…





Local distilleries are relying on curbside bottle sales - and small batches of hand sanitizer - to stay afloat

Drink Local. In tumultuous times, one thing remains true: People still want their spirits.…





Why COVID-19 patients at the VA hospital in Spokane aren't counted as 'hospitalized'

If you go to check how many people are hospitalized with COVID-19 in Spokane, the Spokane County Regional Health District website will give you an answer. Right now, it lists eight people as currently hospitalized with COVID-19, and that number has been trending downward.…





The Zags are the WCC champs. But why was this season so surprising?

It’s March. The regular season is now in the rearview mirror.…