mix

An Overview of Semiparametric Extensions of Finite Mixture Models

Sijia Xiang, Weixin Yao, Guangren Yang.

Source: Statistical Science, Volume 34, Number 3, 391--404.

Abstract:
Finite mixture models have been an important tool for exploring complex data structures in many scientific areas, such as economics, epidemiology and finance. Semiparametric mixture models, which have extended traditional finite mixture models over the past decade, have brought forth exciting developments in methodology, theory, and applications. In this article, we provide a selective overview of these newly developed semiparametric mixture models and discuss their estimation methodologies, theoretical properties where applicable, open questions, and recent developments.
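
As background for readers from outside statistics (standard definitions, not this paper's notation): a finite mixture model assumes the density of the data is a weighted sum of component densities,

$$ f(x) \;=\; \sum_{j=1}^{m} \pi_j\, f_j(x;\theta_j), \qquad \pi_j \ge 0, \quad \sum_{j=1}^{m} \pi_j = 1. $$

Semiparametric extensions relax parts of this specification, for example leaving the component densities $f_j$ nonparametric (subject to identifiability constraints such as symmetry), or letting the mixing proportions vary with a covariate, $\pi_j = \pi_j(z)$.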




mix

Tony Rudd - Machadaynu (remix by The Freelance Hairdresser) - *Official Video*       [3m15s]


Original audio available as an mp3 from: http://soundcloud.com/the-freelance-hairdresser - also visit http://www.soundhog.co.uk for the rest [...]




mix

NYAN CAT DISCO REMIX       [1m50s]


Facebook http://fb.com/MyLostGames Nyan Cat Game http://mylostgames.com/play/nyan_cat_lost_in_space




mix

Rockets!: Disney's Man in Space Remix       [2m00s]


Follow Disney on Twitter: http://bit.ly/FollowDisney 3, 2, 1...Blast off! Enter Tomorrowland and join Walt Disney in space. This never-before-seen [...]




mix

For 100 Years, KitchenAid Has Been the Stand-Up Brand of Stand Mixers

Even celebrity chef Julia Child said that the sleek appliance made mixing 'marvelous'




mix

$200 cheques for Manitoba seniors draw mix of praise, criticism

Earlier this week, Manitoba's premier announced $200 cheques for seniors to help pay for increased costs during the COVID-19 pandemic. But some wonder if there is a better way to help those in need.




mix

Music reviews: Drake flounders on a surprise mixtape of leaks and demos

DRAKE - DARK LANE DEMO TAPES




mix

Comixology

If you're into comic books but you don't want to deal with stacks of back issues or the hassle of going to your local comic shop, Amazon's Comixology is the way to go.




mix

Therapeutic efficacy of a mixed formulation of conventional and PEGylated liposomes containing meglumine antimoniate, combined with allopurinol, in dogs naturally infected with Leishmania infantum [Experimental Therapeutics]

Treatment of dogs naturally infected with Leishmania infantum using meglumine antimoniate (MA) encapsulated in conventional liposomes (LC), in association with allopurinol, has previously been reported to markedly reduce the parasite burden in the main infection sites. Here, a new assay in naturally infected dogs was performed using a novel liposome formulation of MA consisting of a mixture of conventional and long-circulating (PEGylated) liposomes (LCP), with an expected broader distribution among affected tissues of the mononuclear phagocyte system. The experimental groups of naturally infected dogs were as follows: LCP+Allop, receiving LCP intravenously as 2 cycles of 6 doses (6.5 mg Sb/kg/dose) at 4-day intervals, plus allopurinol at 30 mg/kg/12 h p.o. for 130 days; LC+Allop, receiving LC intravenously as 2 cycles of 6 doses (6.5 mg Sb/kg/dose), plus allopurinol for 130 days; Allop, treated with allopurinol only; and a non-treated control. Parasite loads were evaluated by quantitative PCR in the liver, spleen and bone marrow, and by immunohistochemistry in the ear skin, before treatment, just after treatment, and 4 months later. The LCP+Allop and LC+Allop groups, but not the Allop group, showed significant suppression of parasites in the liver, spleen and bone marrow 4 months after treatment, compared to the pre-treatment period or the control group. Only the LCP+Allop group showed a significantly lower parasite burden in the skin in comparison to the control group. On the basis of clinical staging and parasitological evaluations, the LCP formulation exhibited a more favorable therapeutic profile than the LC formulation, and is therefore promising for the treatment of canine visceral leishmaniasis.




mix

The 10 Best Titles For New Comixology Unlimited Subscribers

Comixology Unlimited lets you dive into more than 20,000 digital comics for just $5.99 per month. Make the most of your subscription by starting with these 10 titles.




mix

Mixed Computer Literacy Among Teachers Worldwide

Worldwide, teachers may struggle to help students learn computer skills, finds a study of computer literacy rates across a dozen countries.




mix

Fin24.com | Jozi's mixed Cup fortunes

Soweto and Sandton's windfalls evade the middle market elsewhere in the city.




mix

Fin24.com | Interest cover: a mixed bag

Despite the general declining trend, some companies have managed to reduce their interest bill significantly in relation to their earnings.




mix

Mixed languages, cultures and experiences

Coordinator Whitney Guthrie is grateful for five months of mixed languages, cultures and experiences during OM Chile’s first missions training for both foreigners and Chileans.




mix

Creating a Marketing Mix Model for a Better Marketing Budget: Analytics Corner

Using R programming, marketers can create a marketing mix model to determine how sustainable their audience channels are and make better ad-spend decisions. Here's how.
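
The article works in R; as an illustrative sketch of the core idea (an ordinary least-squares regression of sales on per-channel ad spend), here is a Python analogue. The channel names, spend ranges, and synthetic data below are hypothetical, not taken from the article.

```python
# Minimal marketing-mix-model sketch: regress weekly sales on per-channel
# ad spend to estimate each channel's incremental contribution.
import numpy as np

rng = np.random.default_rng(0)
n_weeks = 104

# Hypothetical weekly spend per channel (in $1000s).
spend = {
    "search": rng.uniform(10, 50, n_weeks),
    "social": rng.uniform(5, 30, n_weeks),
    "tv": rng.uniform(0, 100, n_weeks),
}

# Synthetic sales: baseline + per-channel effects + noise.
sales = (200
         + 2.0 * spend["search"]
         + 1.2 * spend["social"]
         + 0.4 * spend["tv"]
         + rng.normal(0, 10, n_weeks))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n_weeks)] + list(spend.values()))
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)

for name, beta in zip(["baseline"] + list(spend), coef):
    print(f"{name:>8s}: {beta:6.2f}")
```

The fitted coefficients estimate incremental sales per unit of spend in each channel, which is what drives budget reallocation; production models typically add transformations such as adstock (carryover) and saturation curves on top of this skeleton.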




mix

Google’s Hip Hop Doodle back after three years! Mix your music and groove for today

Bringing back an interactive Doodle from 2017, Google is celebrating the birth of hip hop music with its illustration.




mix

Risk from pesticide mixtures below threshold in EFSA assessments

But campaign group concerned about selected modelling tools




mix

World shares mixed amid hopes for business pickup; oil slides

Oil prices slid and world equity markets seesawed Wednesday as investor hopes for a pickup in business activity were weighed against downbeat economic data and low fuel demand highlighted by a rise in U.S. crude stockpiles to three-year highs.




mix

New Rapid Adoption Kit (RAK) Enables Productive Mixed-Signal, Low Power Structural Verification

All engineers can enhance their mixed-signal, low-power structural verification productivity by learning while doing with the PIEA RAK (Power Intent Export Assistant Rapid Adoption Kit). They can verify a mixed-signal chip by automatically generating a macromodel for the analog block and running it through Conformal Low Power (CLP) to perform a low-power structural check.

The power structure integrity of a mixed-signal, low-power block is verified via Conformal Low Power integrated into the Virtuoso Schematic Editor Power Intent Export Assistant (VSE-PIEA); the steps are listed below.

Applying the flow iteratively from lower to higher levels can verify the power structure.

Cadence customers can learn more in a Rapid Adoption Kit (RAK) titled IC 6.1.5 Virtuoso Schematic Editor XL PIEA, Conformal Low Power: Mixed-Signal Low Power Structural Verification.

The RAK includes a demo design, with instructions on how to set up the user environment. It introduces the Power Intent Export Assistant (PIEA) feature implemented in the Virtuoso IC615 release. The extracted power intent is then verified by calling Conformal Low Power (CLP) inside the Virtuoso environment.

  • Last Update: 11/15/2012.
  • Validated with IC 6.1.5 and CLP 11.1

The RAK uses a sample test case to go through PIEA + CLP flow as follows:

  • Setup for PIEA
  • Perform power intent extraction
  • CPF Import: It is recommended to import macro CPF, as opposed to designing CPF for sub-blocks. If you choose to import design CPF files, please make sure the design CPF file has power domain information for all the top-level boundary ports
  • Generate macro CPF and design CPF
  • Perform low power verification by running CLP

It is also recommended to go through older RAKs as prerequisites.

  • Conformal Low Power, RTL Compiler and Incisive: Low Power Verification for Beginners
  • Conformal Low Power: CPF Macro Models
  • Conformal Low Power and RTL Compiler: Low Power Verification for Advanced Users

To access all these RAKs, visit our RAK Home Page for the Synthesis, Test, and Verification flows.

Note: To access the above documents, use your Cadence credentials to log on to the Cadence Online Support (COS) website. Cadence Online Support (https://support.cadence.com/) is your 24/7 partner for getting help and resolving issues related to Cadence software. If you are signed up for e-mail notifications, you can receive new solutions, Application Notes (Technical Papers), videos, manuals, and more.

You can send us your feedback by adding a comment below or using the feedback box on Cadence Online Support.

Sumeet Aggarwal




mix

Mixed-signal and Low-power Demo -- Cadence Booth at DAC

DAC is right around the corner! On the demo floor at Cadence® Booth #2214, we will demonstrate how to use the Cadence mixed-signal and low-power solution to design, verify, and implement a microcontroller-based mixed-signal design. The demo design architecture is very similar to practical designs of many applications like power management ICs, automotive controllers, and the Internet of Things (IoT). Cadence tools demonstrated in this design include Virtuoso® Schematic Editor, Virtuoso Analog Design Environment, Virtuoso AMS Designer, Virtuoso Schematic Model Generator, Virtuoso Power Intent Assistant, Incisive® Enterprise Simulator with DMS option, Virtuoso Digital Implementation, Virtuoso Layout Suite, Encounter® RTL Compiler, Encounter Test, and Conformal Low Power. An extended version of this demo will also be shown at the ARM® Connected Community Pavilion Booth #921.

For additional highlights on Cadence mixed-signal and low-power solutions, stop by our booth for:

  • The popular book, Mixed-signal Methodology Guide, which will be on sale during DAC week!
  • A sneak preview of the eBook version of the Mixed-signal Methodology Guide
  • Customer presentations at the Cadence DAC Theater
    • 9am, Tuesday, June 4  ARM  Low-Power Verification of A15 Hard Macro Using CLP 
    • 10:30am, Tuesday, June 4  Silicon Labs  Power Mode Verification in Mixed-Signal Chip
    • 12:00pm, Tuesday, June 4  IBM  An Interoperable Flow with Unified OA and QRC Technology Files
    • 9am, Wednesday, June 5  Marvell  Low-Power Verification Using CLP
    • 4pm, Wednesday, June 5  Texas Instruments  An Inter-Operable Flow with Unified OA and QRC Technology Files
  • Partner presentations at the Cadence DAC Theater
    • 10am, Monday, June 3  X-Fab  Rapid Adoption of Advanced Cadence Design Flows Using X-FAB's AMS Reference Kit
    • 3:30pm, Monday, June 3  TSMC  TSMC Custom Reference Flow for 20nm - Cadence Track
    • 9:30am, Tuesday, June 4  TowerJazz  Substrate Noise Isolation Extraction/Model Using Cadence Analog Flow
    • 12:30pm, Wednesday, June 5  GLOBALFOUNDRIES  20nm/14nm Analog/Mixed-signal Flow
    • 2:30pm, Wednesday, June 5  ARM  Cortex®-M0 and Cortex-M0+: Tiny, Easy, and Energy-efficient Processors for Mixed-signal Applications
  • Technology sessions at suites
    • 10am, Monday, June 3    Low-power Verification of Mixed-signal Designs
    • 2pm, Monday, June 3      Advanced Implementation Techniques for Mixed-signal Designs
    • 2pm, Monday, June 3      LP Simulation: Are You Really Done?
    • 4pm, Monday, June 3      Power Format Update: Latest on CPF and IEEE 1801  
    • 11am, Wednesday, June 5   Mixed-signal Verification
    • 11am, Wednesday, June 5   LP Simulation: Are You Really Done?
    • 4pm, Wednesday, June 5   Successful RTL-to-GDSII Low-Power Design (FULL)
    • 5pm, Wednesday, June 5   Custom/AMS Design at Advanced Nodes

We will also have three presentations at the Si2 booth (#1427):

  • 10:30am, Monday, June 3   An Interoperable Implementation Solution for Mixed-signal Design
  • 11:30am, Tuesday, June 4   Low-power Verification for Mixed-signal Designs Using CPF
  • 10:30am, Wednesday, June 5   System-level Low-power Verification Using Palladium

 

We have a great program at DAC. Click the link for the complete Cadence DAC Theater and Technology Sessions schedule. We look forward to seeing you at DAC!




mix

mixer pxf simulation error (IC5141, Cadence workshop document)

Hello

The document I referenced is https://filebox.ece.vt.edu/~symort/rfworkshop/Mixer_workshop_instruction.pdf (a Cadence workshop document).

While following the pxf simulation in the document above, my results differ from the document's, and I have a question.

My result picture is shown below.

<my result error>

<document result>

<my direct plot>

<document direct plot>

The differences from the document, on the direct plot screen after the pxf simulation, are:

1. Output harmonics -> Input sideband

2. Frequency axis: out -> Frequency axis: absin

3. The results for port0 (the RF port) are also different (see the photo below).

4. The frequency values in the box are different: my screen shows 5G, 10G, and 1K ~ 10M, but the document shows only 1K ~ 10M.

Could you suggest a solution? Thank you.




mix

gm of an active mixer

Hi all,

What is the most accurate way to simulate the gm of the RF transistors (RF stage) of an active mixer (single-balanced or Gilbert cell)?

I tried to simulate it in many ways, such as:

1. DC annotation (but of course it's incorrect due to the switching operation of the mixer)

2. d(i_ds)/d(v_gs) using HB analysis and then taking the value at zero (since it is a DC characteristic). For this, I chose the HB simulator results: Voltage, spectrum, rms, magnitude.

3. Using the OP, OPT buttons in the calculator and then extracting the gm of the transistor. 

The problem is that each way gives a different value, which makes the procedure of designing an active mixer very difficult. In addition, when I simulate the voltage conversion gain of the active mixer and try to compare it to the formula (2/pi)*gm*RL (either in linear units or dB), I get numbers that are far from the simulations. I understand that I would not get exactly the same results, but not results different by hundreds of percent.

I see in many publications that people plot graphs of a mixer's gm vs. different parameters, and I am starting to doubt whether those results are correct.

I would appreciate any help,

Thanks in advance




mix

Gilbert mixer IIP3

Hi all,

I am having trouble plotting the IIP3 of a Gilbert RF mixer I made.

I have plotted the 1 dB compression point using QPSS and QPAC simulation, with flo = 2.42 GHz, frf = 2.4 GHz, and a 20 MHz IF.

However, my IIP3 simulation shows strange results.

QPSS and QPAC setup




mix

Library Characterization Tidbits: Exploring Intuitive Means to Characterize Large Mixed-Signal Blocks

Let’s review a key characteristic feature of Cadence Liberate AMS Mixed-Signal Characterization that offers ease of use along with many other benefits, such as automated standard Liberty model creation and up to 20X throughput improvement.




mix

The Elephant in the Room: Mixed-Signal Models

Key Findings: Nearly 100% of SoCs are mixed-signal to some extent. Every one of these could benefit from the use of a metrics-driven unified verification methodology for mixed-signal (MD-UVM-MS), but the modeling step is the biggest hurdle to overcome. Without the models, the process breaks down for lack of performance, or leaves holes in the chip verification.

In the last installment of The Low Road, we were at the mixed-signal verification party. While no one talked about it, we all saw it: The party was raging and everyone was having a great time, but they were all dancing around that big elephant right in the middle of the room. For mixed-signal verification, that elephant is named Modeling.

To get to a fully verified SoC, the analog portions of the design have to run orders of magnitude faster than the speediest SPICE engine available. That means an abstraction of the behavior must be created. It puts a lot of people off when you tell them they have to do something extra to get done with something sooner. Guess what, it couldn’t be more true. If you want to keep dancing around like the elephant isn’t there, then enjoy your day. If you want to see about clearing the pachyderm from the dance floor, you’ll want to read on a little more….

Figure 1: The elephant in the room: who’s going to create the model?

 Whose job is it?

Modeling analog/mixed-signal behavior for use in SoC verification seems like the ultimate hot potato. The analog team that creates the IP blocks says it doesn't have the expertise in digital verification to create a high-performance model. The digital designers say they don’t understand anything but ones and zeroes. The verification team, usually digitally-centric by background, is stuck in the middle (and has historically said “I just use the collateral from the design teams to do my job; I don’t create it”).

If there is an SoC verification team, then ensuring that the entire chip is verified ultimately rests upon their shoulders, whether or not they get all of the models they need from the various design teams for the project. That means that if a chip does not work because of a modeling error, it ought to point back to the verification team. If not, is it just a “systemic error” not accounted for in the methodology? That seems like a bad answer.

That all makes the most valuable person in the room the engineer whose knowledge spans the three worlds of analog, digital, and verification. There is a growing number of “mixed-signal verification engineers” on SoC verification teams. Having a specialist appears to be the best approach to getting the job done, and done right.

So, my vote is for the verification team to step up and incorporate the expertise required to do a complete job of SoC verification, analog included. (I know my popularity probably did not soar with the attendees of DVCON with that statement, but the job has to get done).

It’s a game of trade-offs

The difference in computations required for continuous time versus discrete time behavior is orders of magnitude (as seen in Figure 2 below). The essential detail versus runtime tradeoff is a key enabler of verification techniques like software-driven testbenches. Abstraction is a lossy process, so care must be taken to fully understand the loss and test those elements in the appropriate domain (continuous time, frequency, etc.).

Figure 2: Modeling is required for performance

 

AFE for instance

The traditional separation of baseband and analog front-end (AFE) chips has been shifting for the past several years. Advances in process technology, analog-to-digital converters, and the desire for cost reduction have driven both a re-architecting and re-partitioning of the long-standing baseband/AFE solution. Moving more digital processing into the AFE enables lower-cost architectures and eliminates many of the 130 or so PCB traces between the chips.

There is a lot of good scholarly work from a few years back on this subject, such as Digital Compensation of Dynamic Acquisition Errors at the Front-End of ADCs and Digital Compensation for Analog Front-Ends: A New Approach to Wireless Transceiver Design.


Figure 3: AFE evolution from first reference (Parastoo)

The digital calibration and compensation can be achieved by the introduction of a programmable solution. This is in fact the most popular approach amongst the mobile crowd today. By using a microcontroller, the software algorithms become adaptable to process-related issues and modifications to protocol standards.

However, for the SoC verification team, the job just got a whole lot harder. To determine whether the interplay of the digital control and the analog function is working correctly, the software algorithms must be simulated on the combination of the two. This is a classic case of inseparable mixed-signal verification.

So, what needs to be in the model is the big question. And the answer is: a lot. In this example, the main sources of dynamic error at the front-end of the ADC must be captured, because the nonlinear digital filtering that corrects for them is highly frequency dependent. The correction scheme must be verified to show that the nonlinearities are cancelled across the entire bandwidth of the ADC.

All of this means lots of simulation, with the right level of detail retained to ensure the integrity of the verification process. It also means that domain experience must be added to that mixed-signal verification engineer's list of expertise.

Back to the pachyderm

There is a lot more to say on this subject, and lots will be said in future posts. The important starting point is the recognition that the potential flaw in the system needs to be examined. It needs to be examined by a specialist.  Maybe a second opinion from the application domain is needed too.

So, put that cute little elephant on your desk as a reminder that the beast can be tamed.

 

 

Steve Carlson

Related stories

It’s Late, But the Party is Just Getting Started




mix

Mixing It Up in Hardware (an Advantest Case Study in Faster Full-Chip Simulations)

Key Findings: Advantest, in mixed-signal SoC design, sees a 50X speedup; a 25-day test is reduced to 12 hours, with a dramatic increase in test coverage.

Trolling through the CDNLive archives, I discovered another gem. At the May 2013 CDNLive in Munich, Thomas Henkel and Henriette Ossoinig of Advantest presented a paper titled “Timing-accurate emulation of a mixed-signal SoC using Palladium XP”. Advantest makes advanced electronics test equipment. Among the semiconductor designs they create for these products is a test processor chip with over 100 million logic transistors, but also lots of analog functions. They set out to find a way to speed up their full-chip simulations to the point where they could run the system software. To do that, they needed about a 50X speed-up. Well, they did it!


Figure 1: Advantest SoC Test Products

 

To skip the commentary, read Advantest's paper here

Problem Statement

Software is becoming a bigger part of just about every hardware product in every market today, and that includes the semiconductor test market. To achieve high product quality in the shortest amount of time, the hardware and software components need to be verified together as early in the design cycle as possible. However, the throughput of a typical software RTL simulation is not sufficient to run significant amounts of software on a design with hundreds of millions of transistors.  

Executing software on RTL models of the hardware means long runs  (“deep cycles”) that are a great fit for an emulator, but the mixed-signal content posed a new type of challenge for the Advantest team.  Emulators are designed to run digital logic. Analog is really outside of the expected use model. The Advantest team examined the pros and cons of various co-simulation and acceleration flows intended for mixed signal and did not feel that they could possibly get the performance they needed to have practical runtimes with software testbenches. They became determined to find a way to apply their Palladium XP platform to the problem.

Armed with the knowledge of the essential relationship between the analog operations and the logic and software operations, the team was able to craft models of the analog blocks using reduction techniques that accurately depicted the essence of the analog function required for hardware-software verification without the expense of a continuous time simulation engine.

The requirements boiled down to the following:

• Generation of digital signals with highly accurate and flexible timing

• Complete chip needs to run on Palladium XP platform

• Create high-resolution timing (100fs) with reasonable emulation performance, i.e. at least 50X faster than simulation on the fastest workstations

Solution Idea

The solution approach chosen was to simplify the functional model of the analog elements of the design down to the generation of digital signal edges with high timing accuracy. The solution employed a fixed-frequency central clock that was used as a reference. Timing-critical analog signals used to produce accurately placed digital outputs were encoded into multi-bit representations that modeled the transition and timing behavior. A cell library was created that took the encoded signals and converted them to the desired “regular signals”.

Automation was added to the process by changing the netlisting to widen the analog signals according to user-specified schematic annotations. All of this was done in a fashion that is compatible with debugging in Cadence’s SimVision tool. Details on all of these facets to follow.

The Timing Description Unit (TDU) Format

The innovative thinking that enabled the use of Palladium XP was the idea of combining a reference clock and quantized signal encoding to create offsets from the reference. The implementation of these ideas was done in a general manner so that different bit widths could easily be used to control the quantization accuracy.

 

Figure 2: Quantization method using signal encoding
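
To make the encoding concrete, here is a minimal Python sketch of the general idea: a transition time becomes a (reference-clock cycle, n-bit offset code) pair. This is an illustration of the quantization scheme as described, not Advantest's actual TDU format or cell library; the grid period and bit width are assumptions chosen to land under the 100fs resolution target.

```python
# Encode/decode a transition time as (grid-clock cycle, quantized offset).
T_REF = 1e-9     # assumed 1 ns reference (grid) clock period
N_BITS = 14      # 2**14 steps of ~61 fs each, under the 100 fs target
STEP = T_REF / 2**N_BITS

def encode(t):
    """Encode an absolute transition time as (cycle index, offset code)."""
    cycle, offset = divmod(t, T_REF)
    return int(cycle), round(offset / STEP)

def decode(cycle, code):
    """Recover the quantized transition time from its encoding."""
    return cycle * T_REF + code * STEP

t = 3.141592e-9                 # a transition at ~3.14 ns
cycle, code = encode(t)
err = abs(decode(cycle, code) - t)
print(cycle, code, err)         # quantization error <= STEP/2 (~30 fs)
```

In the real flow, the offset codes travel on multi-bit buses through ordinary emulator logic, and the converter cells described next turn them back into timing-accurate digital edges.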

 

Timed Cell Modeling

You might be thinking – timing and emulation, together?! Yes, and here’s a method to do it….

The engineering work in realizing the TDU idea involved the creation of a library of cells that could be used to compose the functions that convert the encoded signal into the “real signals” (timing-accurate digital output signals). Beyond some basic logic cells (e.g., INV, AND, OR, MUX, DFF, TFF, LATCH), some special cells such as window-latch, phase-detect, vernier-delay-line, and clock-generator were created. The converter functions were all composed from these basic cells. This approach ensured an easy path from design into emulation.

The solution was made parameterizable to handle varying needs for accuracy.  Single bit inputs need to be translated into transitions at offset zero or a high or low coding depending on the previous state.  Single bit outputs deliver the final state of the high-resolution output either at time zero, the next falling, or the next rising edge of the grid clock, selectable by parameter. Output transitions can optionally be filtered to conform to a configurable minimum pulse width.

Timed Cell Structure

There are four critical elements to the design of the conversion function blocks (timed cells):

  • Input conditioning – convert to zero-offset, with optional glitch preservation and multi-cycle paths

  • Transition sorting – sort transitions according to timing offset and specified precedence

  • Function – for each input transition, create the appropriate output transition

  • Output filtering – optionally remove multiple transitions, zero-width pulses, etc.

Timed Cell Caveat

All of the cells are combinational and deliver a result in the same cycle as an input transition. This holds for storage elements as well; for example, a DFF will have feedback to hold its state. Because feedback creates combinational loops, the loops need a designation to be broken (using a brk input conditioning function in this case – more on this later). This creates an additional requirement that flip-flop clock signals be restricted to two edges per reference clock cycle.

Note that without minimum width filtering, the number of output transitions of logic gates is the sum of all input transitions (potentially lots of switching activity). Also note that the delay cell has the effect of doubling the number of output transitions per input transition.

 

Figure 3: Edge doubling will increase switching during execution

 

SimVision Debug Support

The debug process was set up to revolve around VCD file processing, directed and viewed within the SimVision debug tool. To make the functional behavior understandable, the raw simulation output is post-processed so that the encoded signals appear as high-precision timing signals in the waveform viewer. The flow is shown in the figure below.

 

Figure 4: Waveform post-processing flow

 

The result of the flow is a functional debug view that includes association across representations of the design and testbench, including those high-precision timing signals.

 

Figure 5: SimVision debug window setup

 

Overview of the Design Under Verification (DUV)

Verification has to prove that the analog design works correctly together with the digital part. The critical elements to verify include:

• Programmable delay lines move data edges with sub-ps resolution

• PLL generates clocks with wide range of programmable frequency

• High-speed data stream at output of analog is correct

These goals can be achieved only if parts of the analog design are represented with fine resolution timing.

 

Figure 6: Mixed-signal design partitioning for verification

 

How to Get to a Verilog Model of the Analog Design

There was an existing Verilog cell library with basic building blocks that included:

- Gates, flip-flops, muxes, latches

- Behavioral models of programmable delay elements, PLL, loop filter, phase detector

With a traditional simulation approach, a cell-based netlist of the analog schematic is created. This netlist is integrated with the Verilog description of the digital design and can be simulated with a normal workstation. To use Palladium simulation, the (non-synthesizable) portions of the analog design that require fine resolution timing have to be replaced by digital timing representation. This modeling task is completed by using a combination of the existing Verilog cell library and the newly developed timed cells.

Loop Breaking

One of the chief characteristics of the timed cells is that they contain only combinational cells that propagate logic from inputs to outputs. Any feedback from a cell’s transitive fanout back to an input creates a combinational loop that must be broken to reach a steady state. Although the Palladium XP loop breaking algorithm works correctly, the timed cells provided a unique challenge that led to unpredictable results.  Thus, a process was developed to ensure predictable loop breaking behavior. The user input to the process was to provide a property at the loop origin that the netlister recognized and translated to the appropriate loop breaking directives.

Augmented Netlisting

Ease of use and flow automation were two primary considerations in creating a solution that could be deployed more broadly. That made creating a one-step netlisting process a high-value item. The signal point annotation and automatic hierarchy expansion of the “digital timing” parameter helped achieve that goal. The netlister was enriched to identify the key schematic annotations at any point in the hierarchy, including bit and bus signals.

Consistency checking and annotation reporting created a log useful in debugging and evolving the solution.

Wrapper Cell Modeling and Verification

The netlister generates a list of the schematic instances at the designated “netlister stop level” that require a Verilog model with fine resolution timing. For the design in this paper, there were 160 such instances.

The library of timed cells was created; these cells were actually “wrapper” cells composed of the primitives for timed cell modeling described above. A new verification flow was created that used the behavior of the primitive cells as a reference for the expected behavior of the composed cells. The testing of the composed cells had the timing width parameter set to 1 to enable direct comparison to the primitive cells. The Cadence Incisive Enterprise Simulator tool was successfully employed to perform assertion-based verification of the composed cells against the existing primitive cells.

Mapping and Long Paths

Initial experiments showed that including the fine resolution timed cells in the digital emulation environment would roughly double the required capacity per run. As previously pointed out, the timed cells having only combinational forward paths creates a loop issue; it also produced some paths of more than 5,000 logic steps. A timed cell optimization process helped solve this problem: the basic idea was to break up long paths by adding flip-flops in strategic locations, which matters because the maximum achievable emulation speed is related to combinational path length.

Results

Once the flow was in place and some realistic test cases were run through it, further performance tuning opportunities were discovered that additionally reduced runtimes (e.g., Palladium XP tbrun mode was used to gain speed). The reference used for the overall speed gains of this solution was a purely software-based simulation on the highest-performance workstation available.

The findings of the performance comparison were startlingly good:

• On Palladium XP, the simulation speed is 50X faster than on Advantest’s fastest workstation

• A software simulation that ran for 25 days can now be run in 12 hours; realistic runtimes enable long-running tests that were not feasible before (see the arithmetic check after this list)

• They now have 500 tests that execute once in more than 48 hours

• These can be run much more frequently using randomization, which will increase test coverage dramatically
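
As a quick consistency check on the first two results (my arithmetic, not the paper's):

$$ 25 \text{ days} \times 24\ \tfrac{\text{h}}{\text{day}} = 600 \text{ h}, \qquad \frac{600 \text{ h}}{50} = 12 \text{ h}. $$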

Steve Carlson




mix

Five Reasons I'm Excited About Mixed-Signal Verification in 2015

Key Findings: Many more design teams will be reaching the mixed-signal methodology tipping point in 2015. That means you need to have a (verification) plan, and measure and execute against it.

As 2014 draws to a close, it is time to look ahead to the coming years and make a plan. While the macro view of the chip design world shows that it has been a mixed-signal world for a long time, it has been primarily the digital teams that have rapidly evolved design and verification practices over the past decade. Well, I claim that is about to change. 2015 will be a watershed year for many more design teams because of the following factors:

  • 85% of designs are mixed signal, and it is going to stay that way (there is no turning back)
  • Advanced node drives new techniques, but they will be applied on all nodes
  • Equilibrium of mixed-signal designs being challenged, complexity raises risk level
  • Tipping point signs are evident and pervasive, things are going to change
  • The convergence of “big A” and “big D” demands true mixed-signal practices

Reason 1: Mixed-signal is dominant

To begin the examination of what is going to change and why, let’s start with what is not changing. IBS reports that mixed signal accounts for over 85% of chip design starts in 2014, and that the percentage will hold steady at around 85% in the coming years. It is a mixed-signal world and there is no turning back!

 

Figure 1. IBS: Mixed-signal design starts as percent of total

The foundational nature of mixed-signal designs in the semiconductor industry is well established. The reason it is exciting is that a stable foundation provides a platform for driving change. (It’s hard to drive on crumbling infrastructure.  If you’re from California, you know what I mean, between the potholes on the highways and the earthquakes and everything.)

Reason 2: Innovation in many directions, mostly mixed-signal applications

While the challenges being felt at the advanced nodes, such as double patterning and the adoption of FinFET devices, have slowed some from moving onto nodes past 28nm, innovation has simply turned in different directions. Applications for the Internet of Things, automotive, and medical all have strong mixed-signal elements in their semiconductor content value proposition. What is critical to recognize is that many of the design techniques initially driven by advanced-node programs have merit across the spectrum of active semiconductor process technologies. For example, digitally controlled, calibrated, and compensated analog IP, along with power-reducing multi-supply domains, power shut-off, and state retention, are being applied in many programs on “legacy” nodes.

Another graph from IBS shows that the design starts at 45nm and below will continue to grow at a healthy pace.  The data also shows that nodes from 65nm and larger will continue to comprise a strong majority of the overall starts. 


Figure 2.  IBS: Design starts per process node

TSMC made a comprehensive announcement in September related to “wearables” and the Internet of Things. From their press release:

TSMC’s ultra-low power process lineup expands from the existing 0.18-micron extremely low leakage (0.18eLL) and 90-nanometer ultra low leakage (90uLL) nodes, and 16-nanometer FinFET technology, to new offerings of 55-nanometer ultra-low power (55ULP), 40ULP and 28ULP, which support processing speeds of up to 1.2GHz. The wide spectrum of ultra-low power processes from 0.18-micron to 16-nanometer FinFET is ideally suited for a variety of smart and power-efficient applications in the IoT and wearable device markets. Radio frequency and embedded Flash memory capabilities are also available in 0.18um to 40nm ultra-low power technologies, enabling system level integration for smaller form factors as well as facilitating wireless connections among IoT products.

Compared with their previous low-power generations, TSMC’s ultra-low power processes can further reduce operating voltages by 20% to 30% to lower both active power and standby power consumption and enable significant increases in battery life—by 2X to 10X—when much smaller batteries are demanded in IoT/wearable applications.
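
As a first-order sanity check on those voltage figures (my arithmetic, using the standard $P_{\text{dyn}} \propto C V^2 f$ approximation with frequency held constant):

$$ \frac{P_{\text{new}}}{P_{\text{old}}} \approx \left(\frac{V_{\text{new}}}{V_{\text{old}}}\right)^{2}, \qquad 0.8^2 = 0.64, \qquad 0.7^2 = 0.49, $$

so a 20% to 30% supply reduction alone cuts active power by roughly 36% to 51%; leakage, and hence standby power, also falls with supply voltage, which is consistent with the battery-life multiples quoted above.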

The focus on power is quite evident and this means that all of the power management and reduction techniques used in advanced node designs will be coming to legacy nodes soon.

Integration and miniaturization are being pursued from the system-level in, as well as from the process side. Techniques for power reduction and system energy efficiency are central to innovations under way.  For mixed-signal program teams, this means there is an added dimension of complexity in the verification task. If this dimension is not methodologically addressed, the level of risk adds a new dimension as well.

Reason 3: Trends are pushing the limits of established design practices

Risk is the bane of every engineer, but without risk there is no progress. And, sometimes the amount of risk is not something that can be controlled. Figure 3 shows some of the forces at work that cause design teams to undertake more risk than they would ideally like. With price and form factor as primary value elements in many growing markets, integration of analog front-end (AFE) with digital processing is becoming commonplace.  

 

Figure 3.  Trends pushing mixed-signal out of equilibrium

The move to the sweet spot of manufacturing at 28nm enables more integration, while providing excellent power and performance parameters with the best cost per transistor. Variation becomes greater and harder to control. For analog design, this means more digital assistance for calibration and compensation. For the greatest flexibility and resiliency, many will opt to embed a microcontroller to perform the analog control functions in software. Finally, the first wave of leaders has already crossed the methodology bridge into true mixed-signal design and verification; those who do not follow are destined to fall farther behind.

Reason 4: The tipping point accelerants are catching fire

The factors cited in Reason 3 all have a technical grounding that creates pain in the chip-development process. The more factors that are present, the harder it is to ignore the pain and forgo the relief afforded by adopting known best practices for truly mixed-signal design (versus divide-and-conquer design along analog and digital lines).

In the past, design performance was measured in MHz with simple static timing and power analysis. Design flows were conveniently partitioned, literally and figuratively, along analog and digital boundaries. Today, however, there are gigahertz digital signals that interact at the package and board level in analog-like ways. New, dynamic power analysis methods enabled by advanced library characterization must be melded into new design flows. These flows comprehend the growing amount of feedback between analog and digital functions that are becoming so interlocked as to be inseparable. This interlock necessitates design flows that include metrics-driven and software-driven testbenches, cross-fabric analysis, electrically aware design, and database interoperability across analog and digital design environments.


Figure 4.  Tipping point indicators

Energy efficiency is a universal driver at this point.  Be it cost of ownership in the data center or battery life in a cell phone or wearable device, using less power creates more value in end products. However, layering multiple energy management and optimization techniques on top of complex mixed-signal designs adds yet more complexity demanding adoption of “modern” mixed-signal design practices.

Reason 5: Convergence of analog and digital design

Divide and conquer is always a powerful tool for complexity management. However, as the number of interactions across the divide increases, the sub-optimality of those frontiers becomes more evident. Convergence is the name of the game. Just as the analog and digital elements of chips are converging, so will the industry practices associated with dealing with the converged world.


Figure 5. Convergence drivers

Truly mixed-signal design is a discipline that unites the analog and digital domains. That means that there is a common/shared data set (versus forcing a single cockpit or user model on everyone). 

In verification, the modern saying is “start with the end in mind”. That means creating a formal plan for what will be tested, how it will be tested, and the metrics for success of the tests. Organizing the mechanics of testbench development using the Universal Verification Methodology (UVM) has proven benefits. The mixed-signal elements of SoC verification are not exempt from those benefits.

Competition is growing ever more fierce for semiconductor design teams. Not being equipped with the best-known practices creates a competitive deficit that is hard to overcome with just hard work. As the landscape of IC content drives toward a more energy-efficient, mixed-signal nature, the mounting risk posed by old methodologies may cause casualties in the coming year. Better to move forward with haste and create a position of strength from which differentiation and excellence in execution can be forged.

Summary

2015 is going to be a banner year for mixed-signal design and verification methodologies. Those that have forged ahead are in a position of execution advantage. Those that have not will be scrambling to catch up, but with the benefits of following a path that has been proven by many market leaders.




mix

Top 5 Issues that Make Things Go Wrong in Mixed-Signal Verification

Key Findings:  There are a host of issues that arise in mixed-signal verification.  As discussed in earlier blogs, the industry trends indicate that teams need to prepare themselves for a more mixed world.  The good news is that these top five pitfalls are all avoidable.

It’s always interesting to study the human condition.  Watching the world through the lens of mixed-signal verification brings an interesting microcosm into focus.  The top 5 items that I regularly see vexing teams are:

  1. When there’s a bug, whose problem is it?
  2. Verification team is the lightning rod
  3. Three (conflicting) points of view
  4. Wait, there’s more… software
  5. There’s a whole new language

Issue 1: When there’s a bug, whose problem is it?

It actually turns out to be a good thing when a bug is found during the design process.  Much, much better than when the silicon arrives back from the foundry of course. Whether by sheer luck, or a structured approach to verification, sometimes a bug gets discovered. The trouble in mixed-signal design occurs when that bug is near the boundary of an analog and a digital domain.


Figure 1.   Whose bug is it?

Typically designers are a diligent sort and make sure that their block works as desired. However, when things go wrong during integration, it is usually also project crunch time. So, it has to be the other guy’s bug, right?

A step in the right direction is to have a third party, a mixed-signal verification expert, apply rigorous methods to the mixed-signal verification task.  But, that leads to number 2 on my list.

 

Issue 2: Verification team is the lightning rod

Having a dedicated verification team with mixed-signal expertise is a great start, but that team is typically hampered by the lack of a fast-executing model of the analog behavior (best practice today being a SystemVerilog real number model, or SV-RNM). That model is critical because it enables orders of magnitude more tests to be run against the design in the same timeframe.

Without that model, there will be a testing deficit. So, when the bugs come in, it is easy for everyone to point their finger at the verification team.


Figure 2.  It’s the verification team’s fault

Yes, the model creates a new task – model validation – but the speed-up enabled by the model more than compensates in terms of functional coverage and schedule.

The postscript on this finger-pointing is the institutionalization of SV-RNM. And, of course, the verification team gets its turn.


Figure 3.  Verification team’s revenge

 

Issue 3: Three (conflicting) points of view

The third common issue arises when the finger-pointing settles down. There is still a delineation of responsibility that is often not easy to achieve when designs of a truly mixed-signal nature are being undertaken.  


Figure 4.  Points of view and roles

Figure 4 outlines some of the delegated responsibility, but notice that everyone is still potentially on the hook to create a model. It is questions of purpose, expertise, bandwidth, and convention that go into the decision about who will “own” each model. It is not uncommon for the modeling task to be a collaborative effort where the expertise on analog behavior comes from the analog team, while the verification team ensures that the model is constructed in such a manner that it will fit seamlessly into the overall chip verification. Less commonly, the digital design team does the modeling simply to enable the verification of their own work.

Issue 4: Wait, there’s more… software

As if verifying the function of a chip were not hard enough, there is a clear trend toward product offerings that include software along with the chip. In the mixed-signal design realm, this software often includes functions such as calibration and compensation that provide a flexible guard against parameter drift. When the combination of the chip and the software is the product, they need to be verified together. This puts an enormous premium on a fast-executing SV-RNM.

 


Figure 5.  There’s software, analog, and digital

While the added dimension of software to the verification task creates new heights of complexity, it also serves as a very strong driver to get everyone aligned and motivated to adopt best known practices for mixed-signal verification.  This is an opportunity to show superior ability!


Figure 6.  Change in perspective, with the right methodology

 

Issue 5: There’s a whole new language

Communication is of vital importance in a multi-faceted, multi-team program.  Time zones, cultures, and personalities aside, mixed-signal verification needs to be a collaborative effort.  Terminology can be a big stumbling block in getting to a common understanding. If we take a look at the key areas where significant improvement can usually be made, we can start to see the breadth of knowledge that is required to “get” the entirety of the picture:

  • Structure – Verification planning and management
  • Methodology – UVM (Universal Verification Methodology – Accellera standard)
  • Measure – MDV (Metrics-driven verification)
  • Multi-engine – Software, emulation, FPGA proto, formal, static, VIP
  • Modeling – SystemVerilog (discrete time) down to SPICE (continuous time)
  • Languages – SystemVerilog, Verilog, Verilog-AMS, VHDL, SPICE, PSL, CPF, UPF

Each of these areas has its own jumble of terminology and acronyms. It never hurts to create a team glossary to start with. Heck, I often get my LDO, IFV, and UDT all mixed up myself.

Summary

Yes, there are a lot of things that make it hard for the humans involved in the process of mixed-signal design and verification, but there is a lot that can be improved once the pain is felt (no pain, no gain is akin to no bugs, no verification methodology change). If we take a look at the key areas from the previous section, we can put a different lens on them and describe the value that they bring:

  • Structure – Uniformly organized, auditable, predictable, transparency
  • Methodology – Reusable, productive, portable, industry standard
  • Measure – Quantified progress, risk/quality management, precise goals
  • Multi-engine – Faster execution, improved schedule, enables new quality level
  • Modeling – Enabler, flexible, adaptable for diverse applications/design styles
  • Languages – Flexible, complete, robust, standard, scalability to best practices

With all of this value firmly in hand, we can turn our thoughts to happier words:

…  stay tuned for more!

 

 Steve Carlson




mix

Verifying Power Intent in Analog and Mixed-Signal Designs Using Formal Methods

Analog and mixed-signal (AMS) designs increasingly use active power management to minimize power consumption. A typical mixed-signal design uses several power domains and operates in a dozen or more power modes, including multiple functional, standby, and test modes. To save power, parts of the design not active in a mode are shut down, or may operate at a reduced supply voltage when high performance is not required. These and other low-power techniques are applied to both the analog and digital parts of the design. Digital designers capture power intent in standard formats like the Common Power Format (CPF), IEEE1801 (aka the Unified Power Format, or UPF), or Liberty, and apply it top-down throughout the design, verification, and implementation flows. Analog parts are often designed bottom-up in schematics without upfront-defined power intent. Verifying that low-power intent is implemented correctly in a mixed-signal design is very challenging. If not discovered early, errors like wrongly connected power nets, missing level shifters, or missing isolation cells can cause costly rework or even a silicon re-spin.

Mixed-signal designers rely on simulation for functional verification. Although still necessary for electrical and performance verification, running simulation across so many power modes is not an effective way to discover low-power errors. It would be nice to augment simulation with formal low-power verification, but a specification of power intent for analog/mixed-signal blocks is missing. So how do we obtain it? Can we “extract” it from an already-built analog circuit? Fortunately, yes we can, and we will describe an automated way to do so!

Virtuoso Power Manager is a new tool, released in the Virtuoso IC6.1.8 platform, that is capable of managing power intent in an analog/MS design captured in the Virtuoso Schematic Editor. In the setup phase, the user identifies power and ground nets and registers special devices like level shifters and isolation cells. The user has the option to import power intent in IEEE1801 format, applicable to the top level or any of the blocks in the design. Virtuoso Power Manager uses this information to traverse the schematic and extract the complete power intent for the entire design. In the final stage, Virtuoso Power Manager exports the power intent in IEEE1801 format as input to the formal verification tool (Cadence Conformal-LP) for static verification of the power intent.

Cadence and Infineon have been collaborating on the requirements and validation of the Virtuoso Power Manager tool and the low-power verification solution on real designs. A summary of the collaboration results was presented at the DVCon conference in Munich in October 2018. Please look for the paper in the conference proceedings for more details. Alternatively, you can view our Cadence webinar, Verifying Low-Power Intent in Mixed-Signal Design Using Formal Method, for more information.





mix

Frankfurt (Oder) looks to get the incentives mix right

The federal state of Brandenburg is committed to ensuring investors are welcomed into Frankfurt (Oder) through a string of generous incentives.




mix

Antwerp's vice-mayor mixes history and innovation

Antwerp’s vice-mayor, Claude Marinower, talks about the city's history as a diverse business location, and its plans to promote traditional industries alongside innovative concepts.




mix

Why mixing wine with tourism could pay off for Moldova

Moldova's wine industry has gained some international recognition but the country remains largely untroubled by tourists, a combination that is enticing some foreign investors.




mix

Stanford researchers develop technology to harness energy from mixing of freshwater and seawater

A new battery made from affordable and durable materials generates energy from places where salt and fresh waters mingle. The technology could make coastal wastewater treatment plants energy-independent and carbon neutral.




mix

Brazil plans to add more solar to its hydro-dominated electricity generation mix

Brazil is the second-largest producer of hydroelectric power in the world, after only China, and hydropower accounted for more than 70% of the country’s electricity generation in 2018. Brazil’s latest 10-year energy plan seeks to maintain this level of hydro generation while increasing the share of nonhydro renewables, particularly solar.




mix

NV Energy Reorients Generation Mix toward Solar, Retiring Coal

The Public Utility Commission of Nevada has approved NV’s long-term IRP to double its renewable energy capacity by 2023. The utility will bring 1,001 MW of solar capacity online via six new power purchase agreements (PPAs).




mix

Renewables Beat Coal in Germany Power Mix for First Time

Renewable energy muscled out coal to become Germany’s biggest source of electricity for the first time last year, helped by a surge in solar panel installations and coal-plant closures.





mix

Papua New Guinea Election Observers Find Mixture of Improvements and Irregularities

HONOLULU (July 19, 2012) – An international delegation of election observers supported by the U.S. Department of State and organized by the East-West Center’s Pacific Islands Development Program (PIDP) monitored the recent month-long general elections in Papua New Guinea. The mission included election specialists, former government officials, academics and civil society representatives from Australia, the Federated States of Micronesia, Solomon Islands and the United States. The delegation was headed by Dr. Gerard A. Finin, Co-Director of the PIDP.




mix

Consulting tenants of mixed-use properties on service charge expenditure

Helpful analysis of when landlords of mixed-use properties need to consult their tenants before incurring service charge costs. The Landlord and Tenant Act 1985 requires landlords who own residential blocks of flats to consult their tenants b...




mix

Japan and South Korea: Two "Like-Minded" States Have Mixed Views on Conflicts in the South China Sea

By Rebecca Strating HONOLULU (24 April 2020)—Many argue that China’s increasingly aggressive posture in the South China Sea is an attempt to unilaterally alter the US-led regional order, which includes an emphasis on freedom of navigation. In response, the US has stressed the importance of support from “like-minded” states—including Japan and South Korea—in defending freedom of navigation in the South China Sea and elsewhere. This characterization, however, disguises important differences in attitudes and behavior that could hinder joint efforts to push back against China.




mix

Mixed Fortunes of LG Group and Hyundai Group in Terms of Market Value

The market value of LG Group's listed companies has risen to third place, overtaking Hyundai Motor Group and now trailing only Samsung and SK. Hyundai Motor Group, which had already ceded second place to SK Group in June this year, has thus fallen behind LG as well. According to the Korea Exchange on September 6, the aggregate market capitalization (including preferred stocks) of LG Group's 16 publicly listed affiliates was 96,883 billion won as of the close of trading on September 5. On the same day, the value of 16 publicly ...




mix

Dentons advises Danescroft Land on acquisition of mixed-use site in St Leonards-on-Sea for development

Dentons has advised real estate developer Danescroft Land Limited, as part of a joint venture with Bridges Fund Management, on the acquisition of Ashdown House in Hastings, St Leonards-on-Sea for future redevelopment.




mix

Secunderabad Cantonment Board to soon have mixed land use zones

HYDERABAD: The Secunderabad Cantonment area may soon have mixed land use zones to support the overall development of its civilian areas. The change is proposed as part of the master plan for the area, to be implemented between 2014 and 2021, and would allow land to be used for residential, commercial and recreational purposes alike. At present there is no master plan for the Secunderabad Cantonment area, and its civilian areas have been developing in a haphazard manner. With the intention of drawing up a master plan, together with a development strategy plan and an investment plan, the Secunderabad Cantonment Board (SCB) recently approached the Hyderabad Metropolitan Development […]




mix

Fast Algorithms for Conducting Large-Scale GWAS of Age-at-Onset Traits Using Cox Mixed-Effects Models [Statistical Genetics and Genomics]

Age-at-onset is one of the critical traits in cohort studies of age-related diseases. Large-scale genome-wide association studies (GWAS) of age-at-onset traits can provide more insights into genetic effects on disease progression and transitions between stages. Moreover, proportional hazards (or Cox) regression models can achieve higher statistical power in a cohort study than analyzing the outcome as a case-control trait with logistic regression. Although mixed-effects models are widely used in GWAS to correct for sample dependence, the application of Cox mixed-effects models (CMEMs) to large-scale GWAS has so far been hindered by intractable computational cost. In this work, we propose COXMEG, an efficient R package for conducting GWAS of age-at-onset traits using CMEMs. COXMEG introduces fast estimation algorithms for general sparse relatedness matrices including, but not limited to, block-diagonal pedigree-based matrices. COXMEG also introduces a fast and powerful score test for dense relatedness matrices, accounting for both population stratification and family structure. In addition, COXMEG generalizes existing algorithms to support positive semidefinite relatedness matrices, which are common in twin and family studies. Our simulation studies suggest that COXMEG, depending on the structure of the relatedness matrix, is orders of magnitude more computationally efficient than coxme and coxph with frailty for GWAS. We found that using sparse approximations of relatedness matrices yielded highly comparable results in controlling the false-positive rate and retaining statistical power for an ethnically homogeneous family-based sample. By applying COXMEG to a study of Alzheimer’s disease (AD) using the National Institute on Aging Late-Onset Alzheimer’s Disease Family Study sample, comprising 3456 non-Hispanic whites and 287 African Americans, we identified the APOE ε4 variant with strong statistical power (P = 1e–101), far more significant than that reported in a previous study using a transformed variable and a marginal Cox model. Furthermore, we identified the novel SNP rs36051450 (P = 2e–9) near GRAMD1B, the minor allele of which significantly reduced the hazard of AD in both sexes. These results demonstrate that COXMEG greatly facilitates the application of CMEMs in GWAS of age-at-onset traits.
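As background, a minimal sketch of the model a CMEM fits, in standard notation (the exact parameterization used by COXMEG may differ):

$$
h_i(t) = h_0(t)\exp\!\left(\mathbf{x}_i^{\top}\boldsymbol{\beta} + u_i\right), \qquad \mathbf{u} = (u_1,\dots,u_n)^{\top} \sim \mathcal{N}(\mathbf{0},\,\tau\mathbf{K}),
$$

where $h_0(t)$ is an unspecified baseline hazard, $\mathbf{x}_i$ collects the tested SNP genotype and fixed-effect covariates for individual $i$, $\mathbf{K}$ is the relatedness matrix and $\tau$ its variance component; the sparse, block-diagonal or positive semidefinite structure of $\mathbf{K}$ is precisely what the package's fast algorithms exploit.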




mix

Sedimentary and tectonic controls on Lower Carboniferous (Visean) mixed carbonate-siliciclastic deposition in NE England and the Southern North Sea: implications for reservoir architecture

Discovery of the Breagh gas field in the Southern North Sea (SNS) has demonstrated the potential of the Lower Carboniferous (Visean, 346.7–330.9 Ma) Farne Group reservoirs to contribute to the UK's future energy mix. New biostratigraphic correlations provide a basis to compare Asbian and Brigantian sedimentary cores from the Breagh Field with age-equivalent sediments exposed on the Northumberland Coast, which has proved critical in understanding exploration and development opportunities. Thirteen facies associations characterize the mixed carbonate–siliciclastic system, grouped into marine, delta front, delta shoreface, lower delta plain and upper delta plain gross depositional environments. The facies associations are interpreted as having been deposited in a mixed carbonate and siliciclastic fluvio-deltaic environment, and are arranged into coarsening- and cleaning-upward cycles (parasequences) bounded by flooding surfaces. Most cycles are characterized by mouth bars, distributary channels, interdistributary bays and common braided rivers, interpreted as river-dominated deltaic deposits. Some cycles include rare shoreface and tidally influenced deposits, interpreted as river-dominated and wave- or tide-influenced deltaic deposits. The depositional processes that formed each cycle have important implications for the reservoir net/gross ratio (the proportion of sandstone beds in a cycle), thickness and lateral extent. The deltaic deposits were controlled by a combination of tectonic and eustatic (allocyclic) events and delta avulsion (autocyclic) processes, and are likely to reflect a changing tectonic regime, from extension within elongate fault-bounded basins (synrift) to passive regional thermal subsidence (post-rift). Deep incision by the Base Permian Unconformity across the Breagh Field has removed the Westphalian, Namurian and upper Visean strata, leaving the thicker, more prospective clastic reservoirs within closure.
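For reference, the net/gross ratio referred to above has the standard reservoir-description definition (general usage, not a formula specific to this study):

$$
\text{N/G} = \frac{\sum_j h_{\text{sand},j}}{H_{\text{cycle}}},
$$

where $h_{\text{sand},j}$ is the thickness of the $j$-th sandstone bed in a cycle and $H_{\text{cycle}}$ is the total thickness of that cycle.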

Thematic collection: This article is part of the Under-explored plays and frontier basins of the UK continental shelf collection available at: https://www.lyellcollection.org/cc/under-explored-plays-and-frontier-basins-of-the-uk-continental-shelf




mix

Meningioma 1 is indispensable for mixed lineage leukemia-rearranged acute myeloid leukemia

Mixed lineage leukemia (MLL/KMT2A) rearrangements (MLL-r) are among the most frequent chromosomal aberrations in acute myeloid leukemia. We evaluated the function of Meningioma 1 (MN1), a co-factor of HOXA9 and MEIS1, in human and murine MLL-rearranged leukemia by CRISPR-Cas9-mediated deletion of MN1. MN1 was required for the in vivo leukemogenicity of MLL-positive murine and human leukemia cells. Loss of MN1 inhibited cell cycle progression and proliferation, promoted apoptosis and induced differentiation of MLL-rearranged cells. Expression analysis and chromatin immunoprecipitation sequencing data from previously reported data sets demonstrated that MN1 primarily maintains active transcription of HOXA9 and HOXA10, which are critical downstream genes of MLL, and of their target genes such as BCL2, MCL1 and Survivin. Treatment of MLL-rearranged primary leukemia cells with anti-MN1 siRNA significantly reduced their clonogenic potential, in contrast to normal CD34+ hematopoietic progenitor cells, suggesting a therapeutic window for MN1 targeting. In summary, our findings demonstrate that MN1 plays an essential role in MLL fusion leukemias and may serve as a therapeutic target in MLL-rearranged acute myeloid leukemia.




mix

Extensive multilineage analysis in patients with mixed chimerism after allogeneic transplantation for sickle cell disease: insight into hematopoiesis and engraftment thresholds for gene therapy

Although studies of mixed chimerism following hematopoietic stem cell transplantation in patients with sickle cell disease (SCD) may provide insights into the engraftment needed to correct the disease and into immunological reconstitution, an extensive multilineage analysis has been lacking. We analyzed chimerism simultaneously in peripheral erythroid and granulomonocytic precursors/progenitors, highly purified B and T lymphocytes, monocytes, granulocytes and red blood cells (RBC). Thirty-four patients with mixed chimerism and ≥12 months of follow-up were included. A selective advantage of donor RBC and their progenitors/precursors led to full chimerism in mature RBC (despite partial engraftment of other lineages) and resulted in clinical control of the disease. Six patients with donor chimerism <50% had hemolysis (reticulocytosis) and higher HbS than their donor. Four of them had donor chimerism <30%, including one patient with an AA donor (hemoglobin >10 g/dL) and three with AS donors (hemoglobin <10 g/dL). However, only one vaso-occlusive crisis occurred, at 68.7% HbS. Except in the patients with the lowest chimerism, donor engraftment was lower for T cells than for the other lineages. In the context of mixed chimerism after hematopoietic stem cell transplantation for SCD, myeloid (rather than T-cell) engraftment was the key efficacy criterion. The results show that myeloid chimerism as low as 30% was sufficient to prevent vaso-occlusive crises in transplants from an AA donor, but not consistently in transplants from an AS donor. However, the correction of hemolysis requires higher donor chimerism levels (i.e. ≥50%) in both AA and AS recipients. In the future, this group of patients may need a different therapeutic approach.




mix

Women’s experiences of diagnosis and management of polycystic ovary syndrome: a mixed-methods study in general practice

Background: Polycystic ovary syndrome (PCOS) is a common lifelong metabolic condition with serious associated comorbidities. Evidence points to a delay in diagnosis and inconsistency in the information provided to women with PCOS.

Aim: To capture women’s experiences of how PCOS is diagnosed and managed in UK general practice.

Design and setting: A mixed-methods study with an online questionnaire survey and semi-structured telephone interviews with a subset of responders.

Method: An online survey to elicit women’s experiences of general practice PCOS care was promoted by charities and BBC Radio Leicester. The survey was accessible online between January 2018 and November 2018. A subset of responders undertook a semi-structured telephone interview to provide more in-depth data.

Results: A total of 323 women completed the survey (average age 35.4 years) and semi-structured interviews were conducted with 11 women. Five key themes were identified through the survey responses. Participants described a variable lag time from presentation to PCOS diagnosis, with a median of 6–12 months. Many had experienced mental health problems associated with their PCOS symptoms, but had not discussed these with the GP. Many were unable to recall any discussion of associated comorbidities with the GP. Some differences were identified between the experiences of women from white British backgrounds and those from other ethnic backgrounds.

Conclusion: From the experiences of the women in this study, it appears that PCOS in general practice is not viewed as a long-term condition with an increased risk of comorbidities, including mental health problems. Further research should explore GPs’ awareness of comorbidities and the differences in PCOS care experienced by women from different ethnic backgrounds.