
What is the future of Boolean algebra-based languages and methodologies?


tahirsengine

I am thinking about (in fact, worried about) a topic that I have no answer for. That's why I am posting here and inviting the international intelligentsia to share their views.

First, my background: I am an ASIC designer who has just started a career in the ASIC design industry. I write code in Verilog and am learning to verify it using SystemVerilog. I have a good background in FPGA-based system design using both Verilog and VHDL. Along the way, I am also considering learning microcontrollers, just to be able to do SoC-based projects. And that's it.

So the point is: all languages, from C++ to Verilog, were designed around the idea of binary, or Boolean, algebra. All microprocessors and digital systems work on this idea.

Soon, quantum-mechanics-based processors will hit the market, where bits will be replaced by qubits and microcontrollers will be replaced by some other type of processor. Chip design methodologies will change, as will the verification methodologies and software. Surely there will be no shortage of engineers for the new technologies either. But for guys like me, who are working with the old technologies, what is a good way out? I mean, to handle this coming revolution, how can we learn the new technologies? (Right now I either earn or go back to university, and university is impossible for me.) And there are many thousands like me out there.

So the question is: will we become useless like the many thousands of analog guys before us, or is there a good solution for us as well?

Please share your views.
 

Analog is not useless.
Really understanding how digital works requires understanding analog; merely using digital does not.

Quantum computing is still a long way off, and it may never be in consumer or industrial products.
It may remain a specialty item, like Cray computers and the like.

When quantum computing comes along, you'll learn that skill, just as every engineer before you has learned new skills.

Logic is still logic, bit or qubit, and both are ultimately two-level: 0 and 1, on or off, etc.

I started with Fortran, then Turbo Pascal, C and C++, and now Python.
I started with TTL logic, then programmed the first PLDs from Monolithic Memories, and now PSoCs.

Nothing trumps experience
 
I started in VHDL and Verilog nearly 15 years ago. About 10 years ago there was a lot of talk of tools like HDL Coder from MathWorks and Handel-C from Celoxica making HDL coders redundant, because software or algorithms guys could come along and replace us.
10 years later we're still in demand, and C to gates hasn't happened yet.
As for quantum: I doubt it's going to make much of an impact for at least 10-20 years after first arriving.
Nothing changes at any fast pace.

I wouldn't worry.
 
C to gates has happened; it's just not good yet. I prototyped a PID controller in C, and it was super quick to get the basic algorithm running. The problem is that you're still on the hook for handling overflow and other nitty-gritty details, at which point the C would have ballooned into something similar to the Verilog.
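To make the overflow point concrete, here is a hypothetical sketch (not from the original post) of what a single fixed-point PID step tends to look like in plain C once the saturation and bit-width bookkeeping are written out by hand, the kind of detail an HLS flow still leaves to you. The Q16.16 format and all names here are illustrative assumptions.

#include <stdint.h>

#define Q 16  /* fractional bits: signals and gains are assumed Q16.16 */

/* Saturate a 64-bit intermediate result back into 32 bits. */
static int32_t sat32(int64_t x)
{
    if (x > INT32_MAX) return INT32_MAX;
    if (x < INT32_MIN) return INT32_MIN;
    return (int32_t)x;
}

/* One PID update. setpoint, measured, kp, ki, kd are Q16.16;
 * integ and prev_err carry state between calls. */
int32_t pid_step(int32_t setpoint, int32_t measured,
                 int32_t kp, int32_t ki, int32_t kd,
                 int32_t *integ, int32_t *prev_err)
{
    int32_t err  = sat32((int64_t)setpoint - measured);
    int32_t derr = sat32((int64_t)err - *prev_err);
    *prev_err = err;

    /* Clamp the accumulated integral term (crude anti-windup). */
    *integ = sat32((int64_t)*integ + err);

    /* Q16.16 * Q16.16 gives Q32.32; shift back down to Q16.16 in 64 bits.
     * (Right-shifting a negative value is arithmetic on typical targets.) */
    int64_t p = ((int64_t)kp * err)      >> Q;
    int64_t i = ((int64_t)ki * (*integ)) >> Q;
    int64_t d = ((int64_t)kd * derr)     >> Q;

    /* Saturate the final sum as well. */
    return sat32(p + i + d);
}

Every widened multiply, shift, and saturation above is something the "quick" floating-point prototype never had to mention, which is roughly why the C ends up converging toward the Verilog.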


I could see the benefits, however. When I coded the block, I registered each step out of habit and in the expectation that pipelining would ultimately be necessary. The C-to-gates compiler, however, analyzed the timing at compile time and inserted pipelining only where needed, giving a lower-latency result.


I do see software guys starting to use C combined with high-level graphical editors to drop in IP blocks and form signal-processing chains.
 

Sorry, I really meant to say that C to gates hasn't really taken over. All it really does is open up the market for Intel/Xilinx, especially now that they could be losing market share to GPUs. If you think about it in a sceptical manner, there isn't really an incentive for the chip makers to optimise C to gates initially, as less optimal designs will hopefully mean sales of bigger and more expensive chips. But it will get optimised when they can see a market for it.

I see Xilinx are pushing a new C-to-gates tool, with all the usual "you won't need HDL anymore" blurb. Reality never really aligns with the expectations. Make sure your CTO is evaluating the licence purchase rather than the CEO and PMs; otherwise you'll be forced to use a "magical" tool based on the sales pitch.
 

Make sure your CTO is evaluating the licence purchase rather than the CEO and PMs; otherwise you'll be forced to use a "magical" tool based on the sales pitch.

Out of everything one takes from this thread...
This is the single most important gem in the thread. If the decision on tools is made by the "business" side of management, you will be forever stuck chasing the next dream tool that will make everything happen in half the time.

All this successfully does is make the vendor's sales and marketing people happy (with their new BMWs). It makes project costs go up and productivity go down, and it leaves the engineers stuck using a tool that makes them look bad because they aren't meeting the design's performance goals, and then working long weekends to write and verify RTL in those performance-critical paths.

It all boils down to that trifecta of performance, cost, and time: you can only have two out of the three. Salespeople always manage to convince those CEOs and PMs that you can get all three with their new tools.

- - - Updated - - -

The future is here: **broken link removed** The programming level is hidden, and their service is to offer higher-level language interfaces.

And how does this relate at all to creating an FPGA design? Does Rigetti have a way to generate programming bit files for Intel, Xilinx, Microsemi, Lattice, etc.?
 

I started in VHDL and Verilog nearly 15 years ago. About 10 years ago there was a lot of talk of tools like HDL Coder from MathWorks and Handel-C from Celoxica making HDL coders redundant, because software or algorithms guys could come along and replace us.
10 years later we're still in demand, and C to gates hasn't happened yet.
As for quantum: I doubt it's going to make much of an impact for at least 10-20 years after first arriving.
Nothing changes at any fast pace.

I wouldn't worry.

Actually, the thing is, I was more alarmed by Intel's strategy. They are not progressing with their traditional processors very fast (I think due to the frequency and multicore bottleneck). But if they are not putting much energy into traditional processors (I know they bought Altera recently, but again, progress on that side is also slow; AMD has made good progress in this area, it seems), then where are they putting their efforts?
I mean, yes, FPGAs will remain good for a long time, but it seems like the contemporary ASIC world and microprocessors (PCs and clusters) have very little time left. Intel might jump into the market with something very different (and further tighten their grip on the market).
And that is going to change a lot, if not everything.
Your thoughts, please.
 

Intel are struggling with their 14 nm and 10 nm chip fabs. They have stuff coming to market now that they were trailing 4 years ago, and it's about 2-3 years later than planned. They are going after the server market.
They have a new bus to connect the CPU and FPGA, because they are now both in the same package, but this limits the bandwidth. They ultimately want both on the same die to allow even greater bandwidth (but there are delays).
 
Intel are struggling with their 14 nm and 10 nm chip fabs. They have stuff coming to market now that they were trailing 4 years ago, and it's about 2-3 years later than planned. They are going after the server market.
They have a new bus to connect the CPU and FPGA, because they are now both in the same package, but this limits the bandwidth. They ultimately want both on the same die to allow even greater bandwidth (but there are delays).

Are they planning something different from FPGAs and microprocessors, or not? Any remote chance or news?
 

Are they planning something different from FPGAs and microprocessors, or not? Any remote chance or news?

Possibly, but nothing that is public, AFAIK.
Remember, any radical change is not going to be bought, and there won't be any experience with it. It's a chicken-and-egg problem.
 
Honestly, I really think many of us will be dead by the time quantum computing becomes a thing (if it ever does).

There are many technologies that are very promising but sometimes are only that: promising. See memristors, cold fusion, graphene.

When such technologies become viable, I will worry about them. Until then, they are only possibilities.
 
