2015

Indian economy to grow by 6.4% in 2015-16

Led by India, South Asia Economic Growth to Accelerate - World Bank report




2015

India e-commerce market will reach $6 billion by 2015 - Gartner

India, the fastest-growing e-commerce market in the Asia-Pacific region, will reach $6 billion by 2015 - Gartner




2015

E-commerce - Online spending to increase by 67% in 2015

E-commerce to Boom: Online spending to increase by 67% in 2015 - ASSOCHAM-PwC




2015

Indian Union Budget 2015-16

No change in tax rates, but some benefits for middle-class taxpayers




2015

IDFC Bank will start banking operations from Oct 1, 2015





2015

Zoetis Sponsors 2015 Veterinary Wellness & Social Work Summit

For the third year, Zoetis proudly served as the lead sponsor of the Veterinary Wellness & Social Work Summit (VWSWS).




2015

Zoetis to Host Webcast and Conference Call on Fourth Quarter and Full Year 2015 Financial Results




2015

Zoetis Reports Fourth Quarter and Full Year 2015 Results




2015

DAC 2015 Cadence Theater – Learn from Customers and Partners

One reason for attending the upcoming Design Automation Conference (DAC 2015) is to learn about challenges other engineers have faced, and hear about their solutions. And the best place to do that is the Cadence Theater, located at the Cadence booth (#3515). The Theater will host continuous half-hour customer and partner presentations from 10:00 am Monday, June 8, to 5:30 pm Wednesday, June 10.

As of this writing, 43 presentations are scheduled. This includes 17 customer presentations, 23 partner presentations, and 3 Cadence presentations. The presentations are open to all DAC attendees, and no reservations are required.

Cadence customers who will be speaking include engineers from AMD, ams, Allegro Micro, Broadcom, IBM, Netspeed, NVIDIA, Renesas, Socionext, and STMicroelectronics. Partner presentations will be provided by ARM, Cliosoft, Dini Group, GLOBALFOUNDRIES, Methodics, Methods2Business, National Instruments, Samsung, TowerJazz, TSMC, and X-Fab.

These informal presentations are given in an interactive setting with an opportunity for questions and answers. Audio recordings with slides will be available at the Cadence web site after DAC. To access recordings of the 2014 DAC Theater presentations, click here.

 

This Cadence DAC Theater presentation drew a large audience at DAC 2015

Here’s a listing of the currently scheduled Cadence DAC Theater presentations. The latest schedule is available at the Cadence DAC 2015 site.

Monday, June 8

 

Tuesday, June 9

 

Wednesday, June 10

 

In a Wednesday session (June 10, 10:00 am) at the theater, the Cadence Academic Network will sponsor three talks on academic/industry collaboration models. Speakers are Dr. Zhuo Li, software architect, Cadence; Prof. Xin Li, Carnegie-Mellon University; and Prof. Laleh Behjat, University of Calgary.

As shown above, there will be giveaways of a set of Bose noise-cancelling headphones, an iPad Mini, and a GoPro Hero3 video camera.

See the Cadence Theater schedule for further details. And be sure to view our Multimedia Site for live blogging and photos and videos from DAC. For a complete overview of Cadence activities at DAC, see our DAC microsite.

Richard Goering

Related Blog Posts

DAC 2015: See the Latest in Semiconductor IP at “IP Talks!”

Cadence DAC 2015 and Denali Party Update

DAC 2015: Tackling Tough Design Problems Head On




2015

Gary Smith at DAC 2015: How EDA Can Expand Into New Directions

First, the good news. The EDA industry will grow from $6.2 billion in 2015 to $9.0 billion in 2019, according to Gary Smith, chief analyst at Gary Smith EDA. Year-to-year growth rates will range from +4% to +11.2%.
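A quick check on those endpoints: growing from $6.2 billion to $9.0 billion over the four years from 2015 to 2019 implies a compound annual growth rate of

\[
\left(\frac{9.0}{6.2}\right)^{1/4} - 1 \approx 9.8\%,
\]

which sits comfortably inside the +4% to +11.2% range of year-to-year rates Smith quoted.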

But in his annual presentation on the eve of the Design Automation Conference (DAC 2015), Smith noted that Wall Street is unimpressed. “The people I talk to want long-term steady growth, no sharp up-turns, no sharp downturns,” Smith said. “To the rest of Wall Street, we’re boring.”

Smith spent the rest of his talk noting how EDA can be a lot less boring and, potentially, a whole lot bigger. For starters, what if we add semiconductor IP to EDA revenues? Now we’re looking at $12.2 billion in revenue by 2019, Smith said. (He acknowledged, however, that the IP market itself is going to take a “dip” due to the move towards platform-based IP and away from conventional piecemeal IP).

This still is not enough to get Wall Street’s attention. Another possibility is to bring embedded software development into the EDA industry. This is not a huge market – about $2.6 billion today – but it is an “easy growth market for us,” according to Smith.

Chasing the Big Bucks

But the “big bucks” are in mechanical CAD (MCAD), Smith said. In the past the MCAD market has always been bigger than EDA, but now EDA is catching up. The MCAD market is about $6.6 billion now. Synopsys and Cadence are larger than PTC and Siemens PLM, two of the main players in MCAD.

There may be some good acquisition possibilities coming up for EDA vendors, Smith said – and if we don’t buy MCAD companies, they might buy EDA companies. Consider, for example, that ANSYS bought Apache and Dassault bought Synchronicity. (Note: Siemens PLM Software is a first-time exhibitor at DAC 2015).

What about other domains? Smith said that EDA companies could conceivably move into optical design, applications development software, biomedical design, and chemical design. The last of these is probably the most tenuous; Smith noted that EDA vendors have yet to look into chemical design.

Applications development software is the biggest market on the above list, but that means competing with Microsoft, IBM, and Oracle. “You’re in with the big boys – is that a good idea?” Smith asked.

Perhaps there’s an opening for a “big play” for an MCAD provider. Smith noted that mechanical vendors are focusing on product data management (PDM). This “is really the IT of design,” Smith said. “They have a lot of hope that the IoT [Internet of things] market is going to give them an opportunity to capture the software that goes from the ground to the cloud. Maybe we can let them have PDM and see if we can take the tool market away from them, or acquire it away from them.”

In conclusion, Smith asked, should the EDA industry accelerate its growth? “The mechanical vendors have already shown interest in acquiring EDA vendors,” he said. “We may not have a choice.”

Richard Goering

NOTE: Catch our live blog from DAC 2015, beginning Monday morning, June 8! Click here





2015

DAC 2015: Google Smart Contact Lens Project Stretches Limits of IC Design

There has been so much hype about the “Internet of Things” (IoT) that it is refreshing to hear about a cutting-edge development project that can bring concrete benefits to millions of people. That project is the ongoing development of the Google Smart Contact Lens, and it was detailed in a keynote speech June 8 at the Design Automation Conference (DAC 2015).

The keynote speech was given by Brian Otis (right), a director at Google and a research associate professor at the University of Washington. The “smart lens” that the project envisions is essentially a disposable contact lens that fits on an eye and continuously monitors blood glucose levels. This is valuable information for anyone who has, or may someday have, diabetes.

Since he was speaking to an engineering audience, Otis focused on the challenges behind building such a device, and described some of the strategies taken by Google and its partner, Novartis. The project required new approaches to miniaturization, low-power design, and connectivity, as well as a comfortable and reliable silicon-to-human interface. Otis discussed the “why” as well and showed how the device could potentially save or improve millions of lives.

Millions of Users

First, a bit of background. Google announced the smart lens project in a blog post in January 2014. Since then it has been featured in news outlets including Forbes, Time, and the Wall Street Journal. In March 2015, Time reported that Google has been granted a patent for a smart contact lens.

The smart lens monitors the level of blood glucose by looking at its concentration in tears. The lens includes a wireless system on chip (SoC) and a miniaturized glucose sensor. A tiny pinhole in the lens allows tear fluid to seep into the sensor, and a wireless antenna handles communications with external devices.

“We figure that if we can solve a huge problem, it is probably worth doing,” Otis said. “Diabetes is one example.” He noted 382 million people worldwide have diabetes today, and that 35% of the U.S. population may be pre-diabetic. Today, diabetics must prick their fingers to test blood glucose levels, a procedure that is invasive, painful, and subject to infrequent monitoring.

According to Otis, the smart contact lens represents a “new category of wearable devices that are comfortable, inexpensive, and empowering.” The lens does sensor data logging and uses a portable instrument to measure glucose levels. It is thin, cheap, and disposable, he said.

Moreover, the lens is not just for people already diagnosed with diabetes—it’s for anyone who is pre-diabetic, or may be at risk due to genetic predisposition. “If we are pro-active rather than re-active,” Otis said, “instead of waiting until a person has full-fledged diabetes, we could make a huge difference in people’s lives and lower the costs of treating them.”

Technical Challenges

No one has built anything quite like the smart lens, so researchers at Google and Novartis are treading new ground. Otis identified three key challenges:

  • Miniaturization: Everything must be really small—the SoC, the passive components, the power supply. Components must be flexible and cheap, and support thin-film integration.
  • Platform: Google has developed a reusable platform that includes tiny, always-on wireless sensors, ultra low-power components, and standards-based interfaces.
  • Data: Researchers are looking for the best ways to get the resulting data into a mobile device and onto the cloud.

Comfort is another concern. “This is not intended to be for the most severe cases,” Otis said. “This is intended to be for all of us as a pro-active way of improving our lifestyles.”

The platform provides a bidirectional encrypted wireless link, integrated power management, on-chip memory, standards-based RFID link, flexible sensor interface, high-resolution potentiostat sensor, and decoupling capacitors. Most of these capabilities are provided by the standard CMOS SoC, which is a couple hundred microns on a side and only “tens of microns” thick.

Otis noted that unpackaged ICs are typically 250 microns thick when they come back from the foundry. Thus, post-processing is needed so the IC will fit into a contact lens.

Furthermore, the design requires precision analog circuitry and additional environmental sensors. “Some of this stuff sounds mundane but it is really hard, especially when you find out you can’t throw large decoupling capacitors and bypass capacitors onto a board, and all that has to be re-integrated into the chip,” Otis said.

Sensor Challenges

Getting information from the human body is challenging. The smart lens sensor does a direct chemical measurement on the surface of the eye. The sensor is designed to work with very low glucose concentrations. This is because the concentration of glucose in tears is an order of magnitude lower than it is in blood.

In brief, the sensor has two parallel plates that are coated with an enzyme that converts glucose into hydrogen peroxide, which flows around the electrodes of the sensor. This is actually a fairly standard way of doing glucose monitoring. However, the smart lens sensor has two electrodes compared to the typical three.
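Otis did not name the enzyme, but the standard chemistry in amperometric glucose sensors of this kind uses glucose oxidase (GOx), so the measurement principle is presumably along these lines:

\[
\text{glucose} + \mathrm{O_2} \xrightarrow{\;\mathrm{GOx}\;} \text{gluconic acid} + \mathrm{H_2O_2}
\]

\[
\mathrm{H_2O_2} \;\longrightarrow\; \mathrm{O_2} + 2\mathrm{H^+} + 2e^-
\]

The current produced by the second (electrode) reaction is proportional to the rate at which glucose reaches the sensor, which is what the potentiostat reads out.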

In manufacturing, it is essential to keep costs low. Otis outlined a three-step manufacturing process:

  • Start with the bottom layer, and mold a contact lens in the way you typically would.
  • Add the electronics package on top of that layer.
  • Build a second layer that encapsulates the electronics and provides the curvature needed for comfort and vision correction.

Beyond the technical challenges are the “clinical” challenges of working with human beings. The human body “is messy and very variable,” Otis said. This variability affects sensor performance and calibration, RF/electro-magnetic performance, system reliability, and comfort.

The final step is making use of the data. “We need to get the data from the device into a phone, and then display it so users can visualize the data,” Otis said. This provides “actionable feedback” to the person who needs it. Eventually, the data will need to be stored in the cloud.

As he concluded his talk, Otis noted that the platform his group developed may have many applications beyond glucose monitoring. “There is a lot you can do with a bunch of logic and sensing capability,” he said, “and there are hundreds of biomarkers beyond glucose.” Clearly this will be an interesting technology to watch.

Richard Goering

Related Blog Post

Gary Smith at DAC 2015: How EDA Can Expand Into New Directions




2015

DAC 2015: Lip-Bu Tan, Cadence CEO, Sees Profound Changes in Semiconductors and EDA

As a leading venture capitalist in electronics technology, as well as CEO of Cadence, Lip-Bu Tan has unique insights into ongoing changes that will impact EDA providers and users. Tan shared some of those insights in a “fireside chat” with Ed Sperling, editor in chief of Semiconductor Engineering, at the Design Automation Conference (DAC 2015) on June 9.

Topics of this discussion included industry consolidation, the need for more talent and more startups, Internet of Things (IoT) opportunities and challenges, the shift from ICs to full product development, and the challenges of advanced nodes. Following are some excerpts from this conversation, held at the DAC Pavilion theater on the exhibit floor.

 

Ed Sperling (left) and Lip-Bu Tan (right) discuss trends in semiconductors and EDA

Q: As you look out over the semiconductor and EDA industries these days, what worries you most?

Tan: At the top of my list is all the consolidation that is going on. Secondly, chip design complexity is increasing substantially. Time-to-market pressure is growing and advanced nodes have challenges.

The other thing I worry about is that we need to have more startups. There’s a lot of innovation that needs to happen. And this industry needs more top talent. At Cadence, we have a program to recruit over 10% of new hires every year from college graduates. We need new blood and new ideas.

Q: EDA vendors were acquiring companies for many years, but now the startups are pretty much gone. Where does the next wave of innovation come from?

Tan: I’ve been an EDA CEO for the last seven years and I really enjoy it because so much innovation is needed. System providers have very big challenges and very different needs. You have to find the opportunities and go out and provide the solutions.

The opportunities are not just in basic tools. Massive parallelism is critical, and the power challenge is huge. Time to market is critical, and for the IoT companies, cost is going to be critical. If you want to take on some good engineering challenges, this is the most exciting time.

Q: You live two lives—you’re a CEO but you’re also an investor. Where are the investments going these days and where are we likely to see new startups?

Tan: Clearly everybody is chasing the IoT. There is a lot of opportunity in the cloud, in the data center. Also, I’m a big believer in video, so I back companies that are video related. A big area is automotive. ADAS [Advanced Driver Assistance Systems] is a tremendous opportunity.

These companies can help us understand how the industry is transforming, and then we can provide solutions, either in terms of IP, tools, or the PCB. Then we need to connect from the system level down to semiconductors. I think it’s a different way to design.

Q: What happens as we start moving from companies looking to design a semiconductor to system companies who are doing things from the perspective that we have this purpose for our software?

Tan: We are extending from EDA to what we call system design enablement, and we are becoming more application driven. The application at the system level will drive the silicon design. We need to help companies look at the whole system including the power envelope and signal integrity. You don’t want to be in a position where you design a chip all the way to fabrication and then find the power is too high.

We help the customers with hardware/software co-design and co-verification. We have a design suite and a verification suite that can provide customers with high-level abstractions, as well as verify IP blocks at the system level. Then we can break things down to the component level with system constraints in mind, and drive power-aware, system-aware design.

We are starting to move into vertical markets. For example, medical is a tremendous opportunity.

Q: How does this approach change what you provide to customers?

Tan: Every year I spend time meeting with customers. I think it is very important to understand what they are trying to design, and it is also important to know the customer’s customer requirements. We might say, “Wait a minute, for this design you may want to think about power or the library you’re using.” We help them understand what foundry they should use and what process they should use. They don’t view me as a vendor; they view me as a partner.

We also work very closely with our IP and foundry partners. We work as one team; the ultimate goal is customer success.

Q: Is everybody going to say, FinFETs are beautiful, we’re going to go down to 10nm or 7nm, or is it a smaller number of companies who will continue down that path?

Tan: Some of the analog/mixed-signal companies don’t need to go that far. We love those customers; we have close to 50% of that business. But we also have customers in the graphics or processor area who are really pushing the envelope, and need to be in 16nm, 14nm, or 10nm. We work very closely with those guys to make sure they can go into FinFETs.

We always want to work with the customer to make sure they have a first-time silicon success. If you have to do a re-spin, you miss the opportunity and it’s very costly.

Q: There’s a new market that is starting to explode: IoT. How real is that world to you? Everyone talks about large numbers, but is it showing up in terms of tools?

Tan: Everybody is talking about huge profits, but a lot of the time I think it is just connecting old devices that you have. Billions of units, absolutely yes, but if you look close enough the silicon percentage of that revenue is very tiny. A lot of the profit is on the service side. So you really need to look at the service killer app you are trying to provide.

What’s most important to us in the IoT market is the IP business. That’s why we bought Tensilica; it’s programmable, so you can find the killer app more quickly. The other challenges are time to market, low power, and low cost.

Q: Where is system design enablement going? Does it expand outside the traditional market for EDA?

Tan: It’s not just about tools. IP is now 11% of our revenue. At the PCB level, we acquired a company called Sigrity, and through that we are able to drive system analysis for power, signal integrity, and thermal. And then we look at some of the verticals and provide modeling all the way from the system level to the component level. We make sure that we provide a solution to the end customer, rather than something piecemeal.

Q: What do you think DAC will look like in five years?

Tan: It’s getting smaller. We need to see more startups and innovative IP solutions. I saw a few here this year, and that’s good. We need to encourage small startups.

Q: Where do we get the people to pull this off? I don’t see too many people coming into EDA.

Tan: I talk to a lot of university students, and I tell them that this small industry is a gold mine. A lot of innovation is needed. We need them to come in [to EDA] rather than join Google or Facebook. Those are great companies, but there is a lot of fundamental physical innovation we need.

Richard Goering

Related Blog Posts

Gary Smith at DAC 2015: How EDA Can Expand Into New Directions

DAC 2015: Google Smart Contact Lens Project Stretches Limits of IC Design

Q&A with Nimish Modi: Going Beyond Traditional EDA




2015

DAC 2015 Accellera Panel: Why Standards are Needed for Internet of Things (IoT)

Design and verification standards are critical if we want to get a new generation of Internet of Things (IoT) devices into the market, according to panelists at an Accellera Systems Initiative breakfast at the Design Automation Conference (DAC 2015) June 9. However, IoT devices for different vertical markets pose very different challenges and requirements, making the standards picture extremely complicated.

The panel was titled “Design and Verification Standards in the Era of IoT.” It was moderated by industry editor John Blyler, CEO of JB Systems Media and Technology. Panelists were as follows, shown left to right in the photo below:

  • Lu Dai, director of engineering, Qualcomm
  • Wael William Diab, senior director for strategy marketing, industry development and standardization, Huawei
  • Chris Rowen, CTO, IP Group, Cadence Design Systems, Inc.

 

In opening remarks, Blyler recalled a conversation from the recent IEEE International Microwave Symposium in which a panelist pointed to the networking and application layers as the key problem areas for RF and wireless standardization. Similarly, in the IoT space, we need to look “higher up” at the systems level and consider both software and hardware development, Blyler said.

Rowen helped set some context for the discussion by noting three important points about IoT:

  • IoT is not a product segment. Vertical product segments such as automotive, medical devices, and home automation all have very different characteristics.
  • IoT “devices” are components within a hierarchy of systems that includes sensors, applications, user interface, gateway application (such as cell phone), and finally the cloud, where all data is aggregated.
  • A bifurcation is taking place in design. We are going from extreme scale SoCs to “extreme fit” SoCs that are specialized, low energy, and very low cost.

Here are some of the questions and answers that were addressed during the panel discussion.

Q: The claim was recently made that given the level of interaction between sensors and gateways, 50X more verification nodes would have to be checked for IoT. What standards need to be enhanced or changed to accomplish that?

Rowen: That’s a huge number of design dimensions, and the way you attack a problem of that scale is by modularization. You define areas that are protected and encapsulated by standards, and you prove that individual elements will be compliant with that interface. We will see that many interesting problems will be in the software layers.

Q: Why is standardization so important for IoT?

Dai: A company that is trying to make a lot of chips has to deal with a variety of standards. If you have to deal with hundreds of standards, it’s a big bottleneck for bringing your products to market. If you have good standardization within the development process of the IC, that helps time to market.

When I first joined Qualcomm a few years ago, there was no internal verification methodology. When we had a new hire, it took months to ramp up on our internal methodology to become effective. Then came UVM [Universal Verification Methodology], and as UVM became standard, we reduced our ramp-up time tremendously. We’ve seen good engineers ramp up within days.

Diab: When we start to look at standards, we have to do a better job of understanding how they’re all going to play with each other. I don’t think one set of standards can solve the IoT problem. Some standards can grow vertically in markets like industrial, and other standards are getting more horizontal. Security is very important and is probably one thing that goes horizontally.

Requirements for verticals may be different, but processing capability, latency, bandwidth, and messaging capability are common [horizontal] concerns. I think a lot of standards organizations this year will work on horizontal slices [of IoT].

Q: IoT interoperability is important. Any suggestions for getting that done and moving forward?

Rowen: The interoperability problem is that many of these [IoT] devices are wireless. Wireless is interesting because it is really hard – it’s not like a USB plug. Wireless lacks the infrastructure that exists today around wired standards. If we do things in a heavily wireless way, there will be major barriers to overcome.

Dai: There are different standards for 4G LTE technology for different [geographical] markets. We have to make a chip that can work for 20 or 30 wireless technologies, and the cost for that is tremendous. The U.S., Europe, and China all have different tweaks. A good standard that works across the globe would reduce the cost a lot.

Q: If we’re talking about the need to define requirements, a good example to look at is power. Certainly you have UPF [Unified Power Format] for the chip, board, and module.

Rowen: There is certainly a big role for standards about power management. But there is also a domain in which we’re woefully under-equipped, and that is the ability to accurately model the different power usage scenarios at the applications level. Too often power devolves into something that runs over thousands of cycles to confirm that you can switch between power management levels successfully. That’s important, but it tells you very little about how much power your system is going to dissipate.

Dai: There are products that claim to be UPF compliant, but my biggest problem with my most recent chip was still with UPF. These tools are not necessarily 100% UPF compliant.

One other concern I have is that I cannot get one simulator to pass my Verilog code and then go to another that will pass. Even though we have a lot of tools, there is no certification process for a language standard.

Q: When we create a standard, does there need to be a companion compliance test?

Rowen: I think compliance is important. Compliance is being able to prove that you followed what you said you would follow. It also plays into functional safety requirements, where you need to prove you adhered to the flow.

Dai: When we [Qualcomm] sell our 4G chips, we have to go through a lot of certifications. It’s often a differentiating factor.

Q: For IoT you need power management and verification that includes analog. Comments?

Rowen: Small, cheap sensor nodes tend to be very analog-rich, lower scale in terms of digital content, and have lots of software. Part of understanding what’s different about standardization is built on understanding what’s different about the design process, and what does it mean to have a software-rich and analog-rich world.

Dai: Analog is important in this era of IoT. Analog needs to come into the standards community.

Richard Goering

Cadence Blog Posts About DAC 2015

Gary Smith at DAC 2015: How EDA Can Expand Into New Directions

DAC 2015: Google Smart Contact Lens Project Stretches Limits of IC Design

DAC 2015: Lip-Bu Tan, Cadence CEO, Sees Profound Changes in Semiconductors and EDA

DAC 2015: “Level of Compute in Vision Processing Extraordinary” – Chris Rowen

DAC 2015: Can We Build a Virtual Silicon Valley?

DAC 2015: Cadence Vision-Design Presentation Wins Best Paper Honors





2015

DAC 2015: How Academia and Industry Collaboration Can Revitalize EDA

Let’s face it – the EDA industry needs new people and new ideas. One of the best places to find both is academia, and a presentation at the Cadence Theater at the recent Design Automation Conference (DAC 2015) described collaboration models that are working today.

The presentation was titled “Industry/Academia Engagement Models – From PhD Contests to R&D Collaborations.” It included these speakers, shown from left to right in the photo below:

  • Prof. Xin Li, Electrical and Computer Engineering, Carnegie-Mellon University (CMU)
  • Chuck Alpert, Senior Software Architect, Cadence
  • Prof. Laleh Behjat, Department of Electrical and Computer Engineering, University of Calgary

 

Alpert, who was filling in for Zhuo Li, Software Architect at Cadence, was the vice chair of DAC 2015 and will be the general chair of DAC 2016 in Austin, Texas. “My team at Cadence really likes to collaborate with universities,” he said. “We’re a big proponent of education because we really need the best and brightest students in our industry.”

Contests Boost EDA Research

One way that Cadence collaborates with academia is participation in contests. “It’s a great way to formulate problems to academia,” Alpert said. “We can have the universities work on these problems and get some strategic direction.”

For example, Cadence has been involved with the annual CAD contest at the International Conference on Computer-Aided Design (ICCAD) since the contest was launched in 2012. This is the largest worldwide EDA R&D contest, and it is sponsored by the IEEE Council on EDA (CEDA) and the Taiwan Ministry of Education. Its goals are to boost EDA research in advanced real-world problems and to foster industry-academia collaboration.

Contestants can participate in one or more problems in the three areas of system design, logic synthesis and verification, and physical design. The 2015 contest has attracted 112 teams from 12 regions. Cadence contributes one problem per year in the logic synthesis area. Zhuo Li was the 2012 co-chair and the 2013 chair. The awards will be given at ICCAD in November 2015.

Another step that Cadence has taken, Alpert said, is to “hire lots of interns.” His own team has four interns at the moment. One advantage to interning at Cadence, he said, is that students get to see real-world designs and understand how the tools work. “It helps you drive your research in a more practical and useful direction,” he said.

The Cadence Academic Network co-sponsors the ACM SIGDA PhD Forum at DAC, and Xin Li and Zhuo Li are on the organizing committee. This event is a poster session for PhD students to present and discuss their dissertation research with people in the EDA community. This year’s forum was “packed,” Alpert said, and it’s clear that the event needs a bigger room.

Finally, Alpert noted, Cadence researchers write and publish technical papers at DAC and other conferences, and Cadence people serve on the DAC technical program committee. “We try to be involved with the academic community on a regular basis,” Alpert said. “We want the best and the brightest people to go into EDA because there is still so much innovation that’s needed. It’s a really cool place to be.”

Research Collaboration Exposes Failure Rates

Xin Li presented an example of a successful research collaboration between CMU and Cadence. The challenge was to find a better way to estimate potential failure rates in memory. As noted in a previous blog post, PhD student Shupeng Sun met this challenge with a new statistical methodology that won a Best Poster award at the ACM SIGDA PhD Forum at DAC 2014.

The new methodology is called Scaled-Sigma Sampling (SSS). It calculates the failure rate and accounts for variability in the manufacturing process while only requiring a few hundred, or a few thousand, sample circuit blocks. Previously, millions of samples were required for an accurate validation of a new design, and each sample could take minutes or hours to simulate. It could take a few weeks or months to run one validation.

The SSS methodology requires greatly reduced simulation times. It makes it possible, Li noted, to run simulations overnight and see the results in the morning.
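The published SSS formulation is more involved, but the core idea can be sketched in a few lines of Python: inflate the process sigma so failures become common, measure failure rates at several inflated sigmas, fit a simple model for the log failure rate, and extrapolate back to the true sigma. The threshold, scale factors, sample counts, and model form below are illustrative stand-ins, not the paper's actual math.

```python
import numpy as np

# Toy sketch of the scaled-sigma idea: inflate the process sigma by a factor
# s > 1 so that rare failures become common, estimate the failure rate at
# several values of s with modest Monte Carlo runs, fit a simple model for
# log P(s), and extrapolate back to the real process at s = 1.

rng = np.random.default_rng(0)
threshold = 4.0                     # failure when x > 4; exact P ~ 3.2e-5
scales = np.array([2.0, 2.5, 3.0, 4.0])
n = 20_000                          # samples per scaled run

log_p = np.array([np.log(np.mean(rng.normal(0.0, s, n) > threshold))
                  for s in scales])

# Fit log P(s) ~ a + b*log(s) + c/s^2, then evaluate at s = 1 (log s = 0)
A = np.column_stack([np.ones_like(scales), np.log(scales), 1.0 / scales**2])
a, b, c = np.linalg.lstsq(A, log_p, rcond=None)[0]
print(f"extrapolated failure rate: {np.exp(a + c):.1e}   (exact: 3.2e-05)")
```

In the toy, Monte Carlo samples are essentially free; in the real method each sample is a circuit simulation, which is why pushing the required count down to a few hundred or a few thousand matters so much.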

Li shared his secret for success in collaborations. “I want to emphasize that before the collaboration, you have to understand the goal. If you don’t have a clear goal, don’t collaborate. Once you define the goal, stick to it and make it happen.”

Contest Provides Learning Experience

Last year Laleh Behjat handed two of her new PhD students a challenge. “I told them there is an ISPD [International Symposium for Physical Design] contest on placement, and I expect you to participate and I expect you to win. Not knowing anything about placement, I don’t think they realized what I was asking them.”

The 2015 contest was called the Blockage-Aware Detailed Routing-Driven Placement Contest. Results were announced at the end of March at ISPD. And the University of Calgary team, despite its lack of placement experience, took second place.

Such contests provide a good learning tool, according to Behjat. Graduate students in EDA, she said, “have to be good programmers. They have to work in teams and be collaborative, be able to innovate, and solve the hardest problems I have seen in engineering and science. And they have to think outside the box.” A contest can bring out all these attributes, she said.

Further, Behjat noted, contest participants had access to benchmarks and to a placement tool. They didn’t have to write tools to find out if their results were good. Industry sponsors, meanwhile, got access to good students and new approaches for solving problems.

“You can see Cadence putting a big amount of time, effort and money to get students here and get them excited about doing contests,” she said. She advised students in the theater audience to “talk to people in the Cadence booth and see if you can have more ideas for collaboration.”

Richard Goering

Related Blog Posts

EDA Plus Academia: A Perfect Game, Set and Match

Cadence Aims to Strengthen Academic Partnerships

BSIM-CMG FinFET Model – How Academia and Industry Empowered the Next Transistor




2015

DAC 2015: Jim Hogan Warns of “Looming Crisis” in Automotive Electronics

EDA investor and former executive Jim Hogan is optimistic about automotive electronics, but he has some concerns as well. At the recent Design Automation Conference (DAC 2015), he delivered a speech titled “The Looming Quality, Reliability, and Safety Crisis in Automotive Electronics...Why is it and what can we do to avoid it?"

Hogan gave the keynote speech for IP Talks!, a series of over 30 half-hour presentations located at the ChipEstimate.com booth. Presenters included ARM, Cadence, eSilicon, Kilopass, Sidense, SilabTech, Sonics, Synopsys, True Circuits, and TSMC. Held in an informal setting, the talks addressed the challenges faced by SoC design teams and showed how the latest developments in semiconductor IP can contribute to design success.

Jim Hogan delivers keynote speech at DAC 2015 IP Talks!

Hogan talked about several phases of automotive electronics. These include assisted driving to avoid collisions, controlled automation of isolated tasks such as parallel parking, and, finally, fully autonomous vehicles, which Hogan expects to see in 15 to 20 years. The top immediate priorities for automotive electronics designers, he said, will be government regulation, fuel economy, advanced safety, and infotainment.

More Code than a Boeing 777

According to Hogan, today’s automobiles use 50-100 microcontrollers per car, resulting in a worldwide automotive semiconductor market of around $40 billion. The global market for advanced automotive electronics is expected to reach $240 billion by 2020. Software is growing faster in the automotive market than it is in smartphones. Hogan quoted a Ford vice president who observed that there are more lines of code in a Ford Fusion car than a Boeing 777 airplane.

One unique challenge for automotive electronics designers is long-term reliability. This is because a typical U.S. car stays on the road for 15 years, Hogan said. Americans are holding onto new vehicles for a record 71.4 months.

Another challenge is regulatory compliance. Aeronautics is highly regulated from manufacturing to air traffic control, and the same will probably be true of automated cars. Hogan speculated that the Department of Transportation will be the regulatory authority for autonomous cars. Today, automotive electronics providers must comply with the ISO26262 automotive functional safety specification.

So where do we go from here? “We’ve got to change our mindset,” Hogan said. “We’ve got to focus on safety and reliability and demand a different kind of engineering discipline.” You can watch Hogan’s entire presentation by clicking on the video icon below, or clicking here. You can also watch other IP Talks! videos from DAC 2015 here.

https://youtu.be/qL4kAEu-PNw

 

Richard Goering

Related Blog Posts

DAC 2015: See the Latest in Semiconductor IP at “IP Talks!”

Automotive Functional Safety Drives New Chapter in IC Verification




2015

Five Reasons I'm Excited About Mixed-Signal Verification in 2015

Key Findings: Many more design teams will be reaching the mixed-signal methodology tipping point in 2015. That means you need to have a (verification) plan, and measure and execute against it.

As 2014 draws to a close, it is time to look ahead to the coming years and make a plan. While the macro view of the chip design world shows that it has been a mixed-signal world for a long time, it has been primarily the digital teams that have rapidly evolved design and verification practices over the past decade. Well, I claim that is about to change. 2015 will be a watershed year for many more design teams because of the following factors:

  • 85% of designs are mixed signal, and it is going to stay that way (there is no turning back)
  • Advanced node drives new techniques, but they will be applied on all nodes
  • Equilibrium of mixed-signal designs being challenged, complexity raises risk level
  • Tipping point signs are evident and pervasive, things are going to change
  • The convergence of “big A” and “big D” demands true mixed-signal practices

Reason 1: Mixed-signal is dominant

To begin the examination of what is going to change and why, let’s start with what is not changing. IBS reports that mixed signal accounts for about 85% of chip design starts in 2014, and that percentage will hold steady in the coming years. It is a mixed-signal world and there is no turning back!

 

Figure 1. IBS: Mixed-signal design starts as percent of total

The foundational nature of mixed-signal designs in the semiconductor industry is well established. The reason it is exciting is that a stable foundation provides a platform for driving change. (It’s hard to drive on crumbling infrastructure.  If you’re from California, you know what I mean, between the potholes on the highways and the earthquakes and everything.)

Reason 2: Innovation in many directions, mostly mixed-signal applications

While the challenges being felt at the advanced nodes, such as double patterning and adoption of FinFET devices, have slowed some teams from moving to nodes past 28nm, innovation has just turned in different directions. Applications for Internet of Things, automotive, and medical all have strong mixed-signal elements in their semiconductor content value proposition. What is critical to recognize is that many of the design techniques that were initially driven by advanced-node programs have merit across the spectrum of active semiconductor process technologies. For example, digitally controlled, calibrated, and compensated analog IP, along with power-reducing multi-supply domains, power shut-off, and state retention are being applied in many programs on “legacy” nodes.

Another graph from IBS shows that the design starts at 45nm and below will continue to grow at a healthy pace.  The data also shows that nodes from 65nm and larger will continue to comprise a strong majority of the overall starts. 


Figure 2.  IBS: Design starts per process node

TSMC made a comprehensive announcement in September related to “wearables” and the Internet of Things. From their press release:

TSMC’s ultra-low power process lineup expands from the existing 0.18-micron extremely low leakage (0.18eLL) and 90-nanometer ultra low leakage (90uLL) nodes, and 16-nanometer FinFET technology, to new offerings of 55-nanometer ultra-low power (55ULP), 40ULP and 28ULP, which support processing speeds of up to 1.2GHz. The wide spectrum of ultra-low power processes from 0.18-micron to 16-nanometer FinFET is ideally suited for a variety of smart and power-efficient applications in the IoT and wearable device markets. Radio frequency and embedded Flash memory capabilities are also available in 0.18um to 40nm ultra-low power technologies, enabling system level integration for smaller form factors as well as facilitating wireless connections among IoT products.

Compared with their previous low-power generations, TSMC’s ultra-low power processes can further reduce operating voltages by 20% to 30% to lower both active power and standby power consumption and enable significant increases in battery life—by 2X to 10X—when much smaller batteries are demanded in IoT/wearable applications.
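The voltage figures map directly onto the first-order dynamic power relation

\[
P_{\text{dyn}} \approx \alpha\, C\, V_{DD}^{2}\, f,
\]

so a 20% to 30% supply reduction by itself cuts dynamic power to roughly \(0.8^2 \approx 0.64\) down to \(0.7^2 \approx 0.49\) of its original value, before any savings from lower frequency or reduced leakage are counted.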

The focus on power is quite evident and this means that all of the power management and reduction techniques used in advanced node designs will be coming to legacy nodes soon.

Integration and miniaturization are being pursued from the system-level in, as well as from the process side. Techniques for power reduction and system energy efficiency are central to innovations under way.  For mixed-signal program teams, this means there is an added dimension of complexity in the verification task. If this dimension is not methodologically addressed, the level of risk adds a new dimension as well.

Reason 3: Trends are pushing the limits of established design practices

Risk is the bane of every engineer, but without risk there is no progress. And, sometimes the amount of risk is not something that can be controlled. Figure 3 shows some of the forces at work that cause design teams to undertake more risk than they would ideally like. With price and form factor as primary value elements in many growing markets, integration of analog front-end (AFE) with digital processing is becoming commonplace.  

 

Figure 3.  Trends pushing mixed-signal out of equilibrium

The move to the sweet spot of manufacturing at 28nm enables more integration, while providing excellent power and performance parameters with the best cost per transistor. Variation becomes greater and harder to control. For analog design, this means more digital assistance for calibration and compensation. For greatest flexibility and resiliency, many will opt for embedding a microcontroller to perform the analog control functions in software. Finally, the first wave of leaders has already crossed the methodology bridge into true mixed-signal design and verification; those who do not follow are destined to fall farther behind.

Reason 4: The tipping point accelerants are catching fire

The factors cited in Reason 3 all have a technical grounding that serves to create pain in the chip-development process. The more factors that are present, the harder it is to ignore the pain and get the relief afforded by adopting known best practices for truly mixed-signal design (versus divide-and-conquer design along analog and digital lines).

In the past design performance was measured in MHz with simple static timing and power analysis. Design flows were conveniently partitioned, literally and figuratively, along analog and digital boundaries. Today, however, there are gigahertz digital signals that interact at the package and board level in analog-like ways. New, dynamic power analysis methods enabled by advanced library characterization must be melded into new design flows. These flows comprehend the growing amount of feedback between analog and digital functions that are becoming so interlocked as to be inseparable. This interlock necessitates design flows that include metrics-driven and software-driven testbenches, cross fabric analysis, electrically aware design, and database interoperability across analog and digital design environments.


Figure 4.  Tipping point indicators

Energy efficiency is a universal driver at this point.  Be it cost of ownership in the data center or battery life in a cell phone or wearable device, using less power creates more value in end products. However, layering multiple energy management and optimization techniques on top of complex mixed-signal designs adds yet more complexity demanding adoption of “modern” mixed-signal design practices.

Reason 5: Convergence of analog and digital design

Divide and conquer is always a powerful tool for complexity management.  However, as the number of interactions across the divide increase, the sub-optimality of those frontiers becomes more evident. Convergence is the name of the game.  Just as analog and digital elements of chips are converging, so will the industry practices associated with dealing with the converged world.


Figure 5. Convergence drivers

Truly mixed-signal design is a discipline that unites the analog and digital domains. That means that there is a common/shared data set (versus forcing a single cockpit or user model on everyone). 

In verification the modern saying is “start with the end in mind”. That means creating a formal plan of what will be tested, how it will be tested, and the metrics for success of the tests. Organizing the mechanics of testbench development using the Universal Verification Methodology (UVM) has proven benefits. The mixed-signal elements of SoC verification are not exempted from those benefits.
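As a concrete, if simplified, illustration of “start with the end in mind,” a metrics-driven plan is just the tests, their coverage goals, and the measured results kept in one machine-readable place. Everything below (test names, goals, numbers) is hypothetical:

```python
# A toy metrics-driven verification plan: each entry pairs a coverage goal
# with the coverage actually measured, and the report rolls them up.

plan = {
    "adc_gain_error":       {"goal": 0.95, "covered": 0.97},
    "pll_lock_time":        {"goal": 0.90, "covered": 0.84},
    "power_mode_switching": {"goal": 1.00, "covered": 0.91},
}

def report(plan):
    for test, m in plan.items():
        status = "PASS" if m["covered"] >= m["goal"] else "SHORT"
        print(f"{test:22s} goal={m['goal']:.2f} covered={m['covered']:.2f} {status}")
    done = sum(min(m["covered"] / m["goal"], 1.0) for m in plan.values()) / len(plan)
    print(f"overall plan completion: {done:.0%}")

report(plan)
```

Executing against the plan then means driving every SHORT line to PASS before signoff, rather than arguing from simulation hours.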

Competition is growing fiercer for semiconductor design teams around the world. Not being equipped with the best-known practices creates a competitive deficit that is hard to overcome with just hard work. As the landscape of IC content drives to a more energy-efficient mixed-signal nature, the mounting risk posed by old methodologies may cause casualties in the coming year. Better to move forward with haste and create a position of strength from which differentiation and excellence in execution can be forged.

Summary

2015 is going to be a banner year for mixed-signal design and verification methodologies. Those that have forged ahead are in a position of execution advantage. Those that have not will be scrambling to catch up, but with the benefits of following a path that has been proven by many market leaders.





2015

Mandriva Linux Security Advisory 2015-208

Mandriva Linux Security Advisory 2015-208 - An issue has been identified in Mandriva Business Server 2's setup package where the /etc/shadow and /etc/gshadow files containing password hashes were created with incorrect permissions, making them world-readable. This update fixes this issue by enforcing that those files are owned by the root user and shadow group, and are only readable by those two entities. Note that this issue only affected new Mandriva Business Server 2 installations. Systems that were updated from previous Mandriva versions were not affected. This update was already issued as MDVSA-2015:184, but the latter was withdrawn as it generated .rpmnew files for critical configuration files, and rpmdrake might propose that the user use those basically empty files, thus leading to loss of passwords or the partition table. This new update ensures that such .rpmnew files are not kept after the update.




2015

Mandriva Linux Security Advisory 2015-209

Mandriva Linux Security Advisory 2015-209 - Updated PHP packages address buffer over-read and overflow vulnerabilities. PHP has been updated to version 5.5.24, which fixes these issues and other bugs. Additionally, the timezonedb package has been upgraded to the latest version, and the PECL packages that require it have been rebuilt against php-5.5.24.




2015

Mandriva Linux Security Advisory 2015-210

Mandriva Linux Security Advisory 2015-210 - A denial of service flaw was found in the way QEMU handled malformed Physical Region Descriptor Table data sent to the host's IDE and/or AHCI controller emulation. A privileged guest user could use this flaw to crash the system. It was found that the QEMU's websocket frame decoder processed incoming frames without limiting resources used to process the header and the payload. An attacker able to access a guest's VNC console could use this flaw to trigger a denial of service on the host by exhausting all available memory and CPU.




2015

Mandriva Linux Security Advisory 2015-211

Mandriva Linux Security Advisory 2015-211 - glusterfs was vulnerable to a fragment header infinite loop denial of service attack. Also, the glusterfsd SysV init script was failing to properly start the service. This was fixed by replacing it with systemd unit files for the service that work properly.




2015

Mandriva Linux Security Advisory 2015-212

Mandriva Linux Security Advisory 2015-212 - An off-by-one flaw, leading to a buffer overflow, was found in the font parsing code in the 2D component in OpenJDK. A specially crafted font file could possibly cause the Java Virtual Machine to execute arbitrary code, allowing an untrusted Java application or applet to bypass Java sandbox restrictions. A flaw was found in the way the Hotspot component in OpenJDK handled phantom references. An untrusted Java application or applet could use this flaw to corrupt the Java Virtual Machine memory and, possibly, execute arbitrary code, bypassing Java sandbox restrictions. A flaw was found in the way the JSSE component in OpenJDK parsed X.509 certificate options. A specially crafted certificate could cause JSSE to raise an exception, possibly causing an application using JSSE to exit unexpectedly. A flaw was discovered in the Beans component in OpenJDK. An untrusted Java application or applet could use this flaw to bypass certain Java sandbox restrictions. A directory traversal flaw was found in the way the jar tool extracted JAR archive files. A specially crafted JAR archive could cause jar to overwrite arbitrary files writable by the user running jar when the archive was extracted. It was found that the RSA implementation in the JCE component in OpenJDK did not follow recommended practices for implementing RSA signatures.
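The jar extraction flaw is the generic “zip-slip” pattern: archive entries whose names contain "../" sequences escape the extraction directory. The advisory does not include the fix; as a sketch of the usual guard, shown here in Python for an ordinary ZIP archive, resolve each entry's final path and refuse anything that lands outside the destination:

```python
import os
import zipfile

def safe_extract(archive_path: str, dest: str) -> None:
    """Extract a ZIP archive, rejecting entries that escape dest."""
    dest = os.path.realpath(dest)
    with zipfile.ZipFile(archive_path) as zf:
        for name in zf.namelist():
            target = os.path.realpath(os.path.join(dest, name))
            if target != dest and not target.startswith(dest + os.sep):
                raise ValueError(f"blocked traversal entry: {name}")
        zf.extractall(dest)
```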




2015

Mandriva Linux Security Advisory 2015-213

Mandriva Linux Security Advisory 2015-213 - lftp incorrectly validates wildcard SSL certificates containing literal IP addresses, so under certain conditions, it would allow and use a wildcard match specified in the CN field, allowing a malicious server to participate in a MITM attack or just fool users into believing that it is a legitimate site. lftp was affected by this issue as it uses code from cURL for checking SSL certificates. The curl package was fixed in MDVSA-2015:098.
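The rule at issue comes from RFC 6125 (and RFC 2818 before it): a wildcard in a certificate name may stand in for one left-most DNS label only, and must never match a literal IP address. A minimal sketch of a conforming check, with the simplifications that implies:

```python
import ipaddress

def cert_name_matches(pattern: str, host: str) -> bool:
    """Match a certificate name against a host per the RFC 6125 wildcard rules."""
    try:
        ipaddress.ip_address(host)
        return pattern == host        # literal IPs: exact match only, no wildcards
    except ValueError:
        pass                          # not an IP address; treat as a DNS name
    if pattern.startswith("*."):
        # the wildcard may cover exactly one left-most label
        return host.split(".", 1)[-1].lower() == pattern[2:].lower()
    return pattern.lower() == host.lower()
```

Under this check, a certificate name like *.0.0.1 can never match the host 127.0.0.1; without the IP guard, the naive one-label wildcard rule would match it, which is exactly the class of bug described above.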




2015

Mandriva Linux Security Advisory 2015-214

Mandriva Linux Security Advisory 2015-214 - The libksba package has been updated to version 1.3.3, which fixes an integer overflow in the DN decoder and a couple of other minor bugs.




2015

Mandriva Linux Security Advisory 2015-215

Mandriva Linux Security Advisory 2015-215 - The t1utils package has been updated to version 1.39, which fixes a buffer overrun, infinite loop, and stack overflow in t1disasm.




2015

Mandriva Linux Security Advisory 2015-216

Mandriva Linux Security Advisory 2015-216 - Lack of filtering in the title parameter of links to rrdPlugin allowed cross-site-scripting attacks against users of the web interface.




2015

Mandriva Linux Security Advisory 2015-217

Mandriva Linux Security Advisory 2015-217 - SQLite before 3.8.9 does not properly implement the dequoting of collation-sequence names, which allows context-dependent attackers to cause a denial of service (uninitialized memory access and application crash) or possibly have unspecified other impact via a crafted COLLATE clause, as demonstrated by COLLATE at the end of a SELECT statement. The sqlite3VdbeExec function in vdbe.c in SQLite before 3.8.9 does not properly implement comparison operators, which allows context-dependent attackers to cause a denial of service (invalid free operation) or possibly have unspecified other impact via a crafted CHECK clause, as demonstrated by CHECK in a CREATE TABLE statement. The sqlite3VXPrintf function in printf.c in SQLite before 3.8.9 does not properly handle precision and width values during floating-point conversions, which allows context-dependent attackers to cause a denial of service or possibly have unspecified other impact via large integers in a crafted printf function call in a SELECT statement. The updated packages provides a solution for these security issues.




2015

Mandriva Linux Security Advisory 2015-218

Mandriva Linux Security Advisory 2015-218 - Multiple vulnerabilities have been found and corrected in glibc. It was discovered that, under certain circumstances, glibc's getaddrinfo() function would send DNS queries to random file descriptors. An attacker could potentially use this flaw to send DNS queries to unintended recipients, resulting in information disclosure or data loss due to the application encountering corrupted data. Various other issues were also addressed. The updated packages provides a solution for these security issues.




2015

Mandriva Linux Security Advisory 2015-220

Mandriva Linux Security Advisory 2015-220 - NTLM-authenticated connections could be wrongly reused for requests without any credentials set, leading to HTTP requests being sent over the connection authenticated as a different user. When doing HTTP requests using the Negotiate authentication method along with NTLM, the connection used would not be marked as authenticated, making it possible to reuse it and send requests for one user over the connection authenticated as a different user.




2015

Mandriva Linux Security Advisory 2015-219

Mandriva Linux Security Advisory 2015-219 - NTLM-authenticated connections could be wrongly reused for requests without any credentials set, leading to HTTP requests being sent over the connection authenticated as a different user. When parsing HTTP cookies, if the parsed cookie's path element consists of a single double-quote, libcurl would try to write to an invalid heap memory address. This could allow remote attackers to cause a denial of service. When doing HTTP requests using the Negotiate authentication method along with NTLM, the connection used would not be marked as authenticated, making it possible to reuse it and send requests for one user over the connection authenticated as a different user.




2015

Mandriva Linux Security Advisory 2015-221

Mandriva Linux Security Advisory 2015-221 - Multiple vulnerabilities have been found and corrected in clamav. The updated packages provides a solution for these security issues.




2015

Mandriva Linux Security Advisory 2015-222

Mandriva Linux Security Advisory 2015-222 - Emanuele Rocca discovered that ppp was subject to a buffer overflow when communicating with a RADIUS server. This would allow unauthenticated users to cause a denial-of-service by crashing the daemon.




2015

Mandriva Linux Security Advisory 2015-224

Mandriva Linux Security Advisory 2015-224 - The Ruby OpenSSL hostname matching implementation violates RFC 6125. The ruby packages for MBS2 have been updated to version 2.0.0-p645, which fixes this issue.




2015

Mandriva Linux Security Advisory 2015-225

Mandriva Linux Security Advisory 2015-225 - The cherokee_validator_ldap_check function in validator_ldap.c in Cherokee 1.2.103 and earlier, when LDAP is used, does not properly consider unauthenticated-bind semantics, which allows remote attackers to bypass authentication via an empty password.
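The trap here is RFC 4513's “unauthenticated bind”: a simple bind with a non-empty DN and an empty password is treated as anonymous and succeeds on many LDAP servers, so an application that equates bind success with valid credentials is bypassable. The defensive pattern, sketched below in Python with the ldap3 library (server details and names are invented), is to reject empty passwords before ever contacting the server:

```python
from ldap3 import Server, Connection

def check_password(user_dn: str, password: str) -> bool:
    """Validate credentials via an LDAP simple bind."""
    if not password:
        # An empty password would become an RFC 4513 unauthenticated bind,
        # which many servers "accept" -- the guard the Cherokee validator lacked.
        return False
    conn = Connection(Server("ldap://ldap.example.com"), user=user_dn,
                      password=password)
    ok = conn.bind()
    conn.unbind()
    return ok
```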




2015

Mandriva Linux Security Advisory 2015-226

Mandriva Linux Security Advisory 2015-226 - FCGI does not perform range checks for file descriptors before use of the FD_SET macro. This FD_SET macro could allow for more than 1024 total file descriptors to be monitored in the closing state. This may allow remote attackers to cause a denial of service (stack memory corruption, and infinite loop or daemon crash) by opening many socket connections to the host and crashing the service.




2015

Mandriva Linux Security Advisory 2015-223

Mandriva Linux Security Advisory 2015-223 - Multiple integer signedness errors in the Dispatch_Write function in proxy/dispatcher/idirectfbsurface_dispatcher.c in DirectFB allow remote attackers to cause a denial of service and possibly execute arbitrary code via the Voodoo interface, which triggers a stack-based buffer overflow. The Dispatch_Write function in proxy/dispatcher/idirectfbsurface_dispatcher.c in DirectFB allows remote attackers to cause a denial of service and possibly execute arbitrary code via the Voodoo interface, which triggers an out-of-bounds write.




2015

Mandriva Linux Security Advisory 2015-227

Mandriva Linux Security Advisory 2015-227 - This update provides MariaDB 5.5.43, which fixes several security issues and other bugs.




2015

Mandriva Linux Security Advisory 2015-228

Mandriva Linux Security Advisory 2015-228 - It was found that libuv does not call setgroups before calling setuid/setgid. This may potentially allow an attacker to gain elevated privileges. The libuv library is bundled with nodejs, and a fixed version of libuv is included with nodejs as of version 0.10.37. The nodejs package has been updated to version 0.10.38 to fix this issue, as well as several other bugs.




2015

Mandriva Linux Security Advisory 2015-229

Mandriva Linux Security Advisory 2015-229 - It was discovered that the snmp_pdu_parse() function could leave incompletely parsed varBind variables in the list of variables. A remote, unauthenticated attacker could exploit this flaw to cause a crash or, potentially, execute arbitrary code.




2015

Mandriva Linux Security Advisory 2015-230

Mandriva Linux Security Advisory 2015-230 - Squid configured with client-first SSL-bump does not correctly validate X509 server certificate domain / hostname fields.




2015

Mandriva Linux Security Advisory 2015-231

Mandriva Linux Security Advisory 2015-231 - Tilmann Haak from xing.com discovered that XML::LibXML did not respect the expand_entities parameter to disable processing of external entities in some circumstances. This may allow attackers to gain read access to otherwise protected resources, depending on how the library is used.
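The fix on the Perl side is to honor expand_entities; for readers on other stacks, the same libxml2 behavior is reachable elsewhere, and a rough Python analogue (an illustration, not the Perl patch itself) is to parse untrusted XML with entity expansion and network access disabled:

```python
from lxml import etree

# Parser settings for untrusted XML: never expand entities, never touch the network.
parser = etree.XMLParser(resolve_entities=False, no_network=True)
doc = etree.fromstring(b"<config><item>ok</item></config>", parser)
print(doc.find("item").text)  # -> ok
```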




2015

Mandriva Linux Security Advisory 2015-232

Mandriva Linux Security Advisory 2015-232 - A malformed certificate input could cause a heap overflow read in the DER decoding functions of Libtasn1. The heap overflow happens in the function _asn1_extract_der_octet().




2015

Trump Said to Mull 2015 Grid Emergency Law to Save Coal Plants

The Trump administration is weighing a broad array of strategies for keeping coal and nuclear power plants online as a matter of national security, with options ranging from invoking a 68-year-old law to a three-year-old one, according to a senior Energy Department official.




2015

OEE 2015 builds on 2014 market study to commercialize MHK energy sectors

During the Ocean Energy Europe 2015 conference held in Dublin, Ireland, in late October, about 500 high-level delegates, including European Union and Member State business leaders and energy ministers, were presented with a draft of the “strategic roadmap” for developing the European marine hydrokinetics (MHK) energy sector.




2015

Ten Clean Energy Stocks For 2015

2015 marks my seventh annual list of ten clean energy stocks. An equal-weighted portfolio of the ten stocks in each year's list has outperformed my industry benchmark every year except 2013. 2014 was no exception, but it was a bittersweet victory in that the model portfolio was slightly down while the benchmark lost considerably more in a very challenging year for clean energy stocks.




2015

2015: The Clean Economy’s Watershed Year?

In a crammed Washington conference room last week, speaker after speaker seemed to apologize for their ‘broken record’ talking points as Bloomberg New Energy Finance and the Business Council for Sustainable Energy unveiled their annual Factbook. But, of course, they were only being honest — like 2013 before it, 2014 had been an unprecedented year for clean energy.




2015

Renewable Energy Roundtable: Production and Investment Tax Policy to be a Top Priority in 2015

The renewable energy industry has come a long way in relatively little time. The costs of renewable technologies continue to go down, while renewable capacities at many utilities continue to go up. Although, in many cases, renewable technology is mature and ready for utility-scale deployment, state and federal production and investment tax policies appear less evolved.




2015

The Big Question: Where Do You See Renewable Energy Growth Potential in 2015?

The annual outlook issue of Renewable Energy World magazine is our attempt to predict what will happen within the renewable energy industry over the course of the year. To do this, we went straight to the top of major renewable energy companies, asking CEOs and presidents to tell us where they are devoting their company resources in order to capitalize on some of the market growth that they expect to see in 2015.




2015

Ten Clean Energy Stocks For 2015: Marching Ahead

My Ten Clean Energy Stocks for 2015 model portfolio added a second month to its winning streak, with a 6.1 percent gain for the month and a 5.7 percent gain for the year, despite a continued drag from the strong dollar. If measured in the companies' local currencies, the portfolio would have been up 7.5 percent for the month and 10.5 percent for the quarter or year to date. For comparison, the broad universe of US small cap stocks rose 1.5 percent for the month and 4.0 percent for the quarter, as measured by IWM, the Russell 2000 index ETF.
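The gap between the dollar and local-currency figures is just the currency translation identity \((1 + r_{\$}) = (1 + r_{\text{local}})(1 + r_{\text{fx}})\). For the year-to-date numbers,

\[
r_{\text{fx}} = \frac{1 + r_{\$}}{1 + r_{\text{local}}} - 1 = \frac{1.057}{1.105} - 1 \approx -4.3\%,
\]

roughly four points of return surrendered to the strong dollar.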




2015

Green Mutual Funds and ETFs May Recover in 2015

Alternative energy mutual funds are continuing to recover from a slump which started in fall 2014. Annual returns range greatly, though, from a high of 15.6 percent for Brown Advisory Sustainable Growth (BIAWX), to a low of -15.8 percent for Guinness Atkinson Alternative Energy (GAAEX). The large 12-month drop by GAAEX was precipitated by painful losses in some of its top weighted holdings.